Big O notation is a mathematical notation used to classify algorithms according to their run time or space requirements as the input size grows. It is often used in analytic number theory to express a bound on the difference between an arithmetical function and a better understood approximation. It is also used to provide similar estimates in other fields, and can characterize functions according to their growth rates.
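For reference, the formal definition behind these growth-rate statements can be sketched as follows, using conventional notation: f(x) = O(g(x)) as x → ∞ means there exist constants M > 0 and x₀ such that |f(x)| ≤ M·|g(x)| for all x ≥ x₀. For example, 3n² + 5n = O(n²), since 3n² + 5n ≤ 4n² for all n ≥ 5.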
Stanford University
Winter 2023
This course helps students transition from coding to problem-solving with computers. It explores techniques, tools, and models for problem-solving across disciplines using C++. Prior programming experience is assumed.
Stanford University
Autumn 2022-2023
Stanford's CS 221 course teaches the foundational principles and practical implementation of AI systems. It covers machine learning, game playing, constraint satisfaction, graphical models, and logic. It is a rigorous course requiring solid foundations in programming, math, and probability.
UC Berkeley
Spring 2020
This is an introductory course on computer science theory, exploring the design and analysis of algorithms, number theory, and complexity. Prerequisites include familiarity with mathematical induction, big-O notation, basic data structures, and programming in a standard language.
Brown University
Fall 2022
CS0150 introduces computer science via object-oriented design and programming, using Java and JavaFX to create interactive programs with GUIs. Concepts such as data structures, algorithms, and computational efficiency are explored. Practical exercises include engaging programming assignments such as Doodle Jump and Tetris. The course is designed for everyone and requires no prior programming knowledge.