Big-O notation is used in software engineering to measure a function's growth; it was introduced in 1894 by Paul Bachmann. Big-O notation has properties that are helpful when estimating the general efficiency of algorithms, and it can be used to compare one algorithm against another. (Source: Data Structures and Algorithms for Game Developers, Allen Sherrod.)
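To make the comparison concrete, here is a minimal sketch (not from the book; the function names are our own) that counts comparisons for linear search, which is O(n), versus binary search on sorted data, which is O(log n). The step counts show how differently the two grow as the input gets large:

```python
def linear_search_steps(items, target):
    """Return (found, comparisons) scanning left to right: O(n)."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            return True, steps
    return False, steps

def binary_search_steps(items, target):
    """Return (found, comparisons) on a sorted list: O(log n)."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return True, steps
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps

data = list(range(1_000_000))
print(linear_search_steps(data, 999_999))  # roughly 1,000,000 comparisons
print(binary_search_steps(data, 999_999))  # roughly 20 comparisons
```

On a million sorted items the linear scan needs about a million comparisons in the worst case, while binary search needs about twenty; this gap is exactly what Big-O growth rates predict.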