As we know, algorithms are analysed by calculating their time complexity and space complexity. Time complexity is expressed using asymptotic notations such as Big-Oh, Theta, and Omega, which is why this kind of analysis is often called asymptotic analysis of algorithms. Using these notations, we describe the rate of growth of an algorithm's running time as the input size n increases, say towards infinity. In other words, we want to know how the running time behaves as the input to an algorithm grows.
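Before getting to the formal definitions, here is a small sketch of the idea of "rate of growth". It counts the comparisons performed by two simple algorithms (the function names and the choice of algorithms are just for illustration): a linear search, whose worst case grows like n, and an all-pairs comparison, which always performs n(n-1)/2 comparisons and so grows like n squared.

```python
def count_linear_search(items, target):
    """Linear search that counts comparisons; worst case is len(items), i.e. growth like n."""
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

def count_pairwise(items):
    """Compare every unordered pair of items; always n*(n-1)/2 comparisons, i.e. growth like n^2."""
    comparisons = 0
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            comparisons += 1
    return comparisons

for n in (10, 100, 1000):
    data = list(range(n))
    # Searching for a value that is absent gives the worst case: n comparisons.
    print(n, count_linear_search(data, -1), count_pairwise(data))
```

Multiplying n by 10 multiplies the first count by 10 but the second by roughly 100; asymptotic notation is the language we use to capture exactly this kind of difference.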
Here I am going to try to explain the mathematical significance of the three notations mentioned above. The mathematical definitions are the same as you will find elsewhere, but I have tried to put the explanation in my own words. If you find any ambiguity or mistake in this post, please leave a comment; I would really like to discuss it and improve my own understanding of the topic.