Transactional memory is a concurrency control mechanism in computer science that allows a group of load and store instructions to execute atomically, similar to database transactions. It provides a high-level abstraction for coordinating concurrent reads and writes of shared data in parallel systems, serving as an alternative to low-level thread synchronization.
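As a concrete illustration of these all-or-nothing semantics, here is a minimal sketch using Haskell's software transactional memory library (Control.Concurrent.STM); the account variables and amounts are hypothetical. Every read and write inside `atomically` commits as a single transaction or is retried, so no other thread can observe a partial transfer.

```haskell
import Control.Concurrent.STM

-- Move funds between two shared balances.
-- The reads and writes inside 'atomically' form one transaction:
-- they either all take effect or none do.
transfer :: TVar Int -> TVar Int -> Int -> IO ()
transfer from to amount = atomically $ do
  balance <- readTVar from
  writeTVar from (balance - amount)
  modifyTVar' to (+ amount)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  transfer a b 30
  readTVarIO a >>= print  -- 70
  readTVarIO b >>= print  -- 30
```

Unlike lock-based code, the programmer states only which operations must appear atomic; the runtime detects conflicts between concurrent transactions and retries as needed.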
Stanford University
Fall 2022
This course covers the principles and trade-offs of designing modern parallel computing systems and teaches parallel programming techniques. It is intended for students who want to understand both parallel hardware and software design. Prerequisite knowledge of computer systems is required.