The memory hierarchy organizes computer storage into levels based on response time, complexity, and capacity. It influences performance in computer architecture design, in predicting algorithm behavior, and in lower-level programming constructs that exploit locality of reference. It is commonly described as four major storage levels (processor registers and cache, main memory, online mass storage, and offline bulk storage), though other structures can be used depending on the application.
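As a minimal sketch of why the hierarchy matters in practice (not taken from any of the courses listed below; the 2048x2048 matrix size is an arbitrary illustrative choice), the C program below sums the same matrix twice: once row by row, matching C's row-major layout so each fetched cache line is fully reused, and once column by column, so nearly every access strides past the cached data. On typical hardware the second traversal is noticeably slower even though both perform identical arithmetic.

/* Memory-hierarchy sketch: compare a cache-friendly (row-major) traversal
 * with a cache-unfriendly (column-major) traversal of the same matrix.
 * N = 2048 is an arbitrary size chosen so the 32 MB matrix does not fit
 * in typical CPU caches. */
#include <stdio.h>
#include <time.h>

#define N 2048

static double a[N][N];

/* Row-major traversal: consecutive iterations touch adjacent addresses,
 * so every cache line fetched from memory is fully used. */
static double sum_rows(void) {
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += a[i][j];
    return s;
}

/* Column-major traversal: consecutive iterations are N * sizeof(double)
 * bytes apart, so nearly every access misses the cache. */
static double sum_cols(void) {
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += a[i][j];
    return s;
}

int main(void) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] = 1.0;

    clock_t t0 = clock();
    double s1 = sum_rows();
    clock_t t1 = clock();
    double s2 = sum_cols();
    clock_t t2 = clock();

    printf("row-major sum %.0f in %.3f s\n", s1, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("col-major sum %.0f in %.3f s\n", s2, (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}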
UC Berkeley
Spring 2020
The course addresses programming parallel computers to solve complex scientific and engineering problems. It covers an array of parallelization strategies for numerical simulation, data analysis, and machine learning, and provides experience with popular parallel programming tools.
Carnegie Mellon University
Fall 2019
This course provides a deep dive into the inner workings of computer systems, enhancing students' effectiveness as programmers. Topics span machine-level code, performance evaluation, computer arithmetic, memory management, and networking protocols. It serves as a foundation for advanced courses like compilers and operating systems.
Wellesley College
Spring 2023
This course explores the inner workings of computers, focusing on how they execute programs. Students gain an in-depth understanding of software and hardware abstractions, ranging from programming languages to transistors. Key areas covered include computational building blocks, hardware-software interfaces, data representation, and practical system abstractions. The course also emphasizes structured reasoning about program execution and promotes skills for independent learning, critical thinking, and problem-solving in computer science.