
OSTEP Chapter 2: Introduction to Operating System Review

·534 words·3 mins

Operating systems are so useful because they handle the following issues for us:

  1. Virtualization of CPU and memory resources
  2. Managing concurrency
  3. Data storage

Heptabase Notes

Chapter 1 is a dialogue, so we skip it.

Operating Systems #

Operating systems make computers easy to use and efficient by virtualizing resources.

Purpose #

To make resources efficient and widely available, while keeping the operating system's own resource usage low and ensuring that applications cannot interfere with the operating system or with each other.

Historical Development #

Initially, operating systems were like tool sheds with many useful tools, where sometimes you could even store your own interesting things (e.g., writing documents into a file).

As time progressed, many people began sharing this tool shed, and we didn’t want others to touch our stuff.

Thus, we developed the system call. You can think of it as the manager of the tool shed, where now you must go through it to place or retrieve items. It ensures that everyone can only access their own things.

As time further progressed, operating systems developed multiprogramming, allowing multiple programs to use the tool shed at the same time.

Three Major Topics #

Virtualization #

CPU #

Q. How can a single CPU be used by multiple programs?
A. By virtualizing the CPU: the OS time-shares the physical CPU, giving each running program the illusion of having its own.

This brings another question:

Q. If two programs want to use the CPU at the same time, which one gets executed first?
A. That is determined by the operating system's scheduling policy.
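The time-sharing idea can be sketched with two processes: both make progress on a single CPU, and the order in which their output lines interleave is up to the scheduler's policy. This is a minimal sketch of mine using `fork()` (the book's own demo, `cpu.c`, instead runs the same binary twice); `run_two_programs` is a hypothetical helper name:

```c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

// Two "programs" (parent and child) both announce themselves; the OS
// decides when each one actually gets the CPU, so the lines may interleave.
int run_two_programs(void) {
    pid_t pid = fork();               // create a second process
    if (pid < 0) return -1;
    if (pid == 0) {                   // child
        for (int i = 0; i < 3; i++)
            printf("A gets the CPU\n");
        fflush(stdout);               // flush before _exit skips stdio cleanup
        _exit(0);
    }
    for (int i = 0; i < 3; i++)       // parent
        printf("B gets the CPU\n");
    int status;
    waitpid(pid, &status, 0);
    return (WIFEXITED(status) && WEXITSTATUS(status) == 0) ? 0 : -1;
}
```

Both processes finish their loops even though only one can be on the CPU at any instant; the scheduler switches between them.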

Memory #

Q. How can memory be accessed by multiple programs simultaneously?
A. By virtualizing memory so that multiple programs can share it.

This leads to a question: what happens if two programs use the same memory address?

With virtualized memory, each running program gets its own private virtual address space, and each believes it has the memory to itself.

Two programs can therefore read and write the same virtual address without interfering, because the OS maps that address to a different physical location in each program.
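This can be observed directly with two processes (the book's `mem.c` demo makes the same point by running one program twice); the sketch below and the helper name `parent_value_after_child_write` are my own. Parent and child see the same virtual address for `x`, yet the child's write never reaches the parent's copy:

```c
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

// Parent and child print the *same virtual address* for x, but after the
// child writes 200, the parent still reads 100: the OS maps that virtual
// address to different physical memory in each process.
int parent_value_after_child_write(void) {
    int x = 100;
    pid_t pid = fork();
    if (pid < 0) return -1;
    if (pid == 0) {                   // child: same &x, private physical copy
        x = 200;
        printf("child:  &x = %p, x = %d\n", (void *)&x, x);
        fflush(stdout);
        _exit(0);
    }
    waitpid(pid, NULL, 0);
    printf("parent: &x = %p, x = %d\n", (void *)&x, x);
    return x;                         // unchanged in the parent
}
```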

Concurrency #

What is a concurrency issue?

```c
#include <stdio.h>
#include <stdlib.h>
#include "common_threads.h"

volatile int counter = 0;  // shared by both threads
int loops;

void *worker(void *arg) {
    int i;
    for (i = 0; i < loops; i++) {
        counter++;  // NOT atomic: load, increment, store
    }
    return NULL;
}

int main(int argc, char *argv[]) {
    if (argc != 2) {
        fprintf(stderr, "usage: threads <loops>\n");
        exit(1);
    }
    loops = atoi(argv[1]);
    pthread_t p1, p2;
    printf("Initial value : %d\n", counter);
    Pthread_create(&p1, NULL, worker, NULL);
    Pthread_create(&p2, NULL, worker, NULL);
    Pthread_join(p1, NULL);
    Pthread_join(p2, NULL);
    printf("Final value   : %d\n", counter);
    return 0;
}
```

```text
prompt> ./thread 100000
Initial value : 0
Final value   : 143012 // huh??
prompt> ./thread 100000
Initial value : 0
Final value   : 137298 // what the??
```

From the code above, we would expect the final value to be 200000, but it almost never is.

The main reason is that the operation counter++ is not atomic, meaning it is not completed in one indivisible step.

It involves at least three steps: reading the value from memory, incrementing it, and writing it back to memory. If the two threads interleave between these steps, some increments are lost.
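One way to see the fix is to make the increment itself indivisible. Below is a minimal sketch of mine using C11 `<stdatomic.h>` (OSTEP itself addresses this problem later with locks, not atomics); `run_two_workers` is a hypothetical helper name:

```c
#include <pthread.h>
#include <stdatomic.h>

static atomic_int counter;
static int loops = 100000;

static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < loops; i++)
        atomic_fetch_add(&counter, 1);  // one indivisible read-modify-write
    return NULL;
}

// Hypothetical helper: run two workers and return the final count.
int run_two_workers(void) {
    atomic_store(&counter, 0);
    pthread_t p1, p2;
    pthread_create(&p1, NULL, worker, NULL);
    pthread_create(&p2, NULL, worker, NULL);
    pthread_join(p1, NULL);
    pthread_join(p2, NULL);
    return atomic_load(&counter);
}
```

With atomic_fetch_add, no increment can be lost between a read and a write, so the final value is reliably 2 × loops.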

Key Factors Causing Concurrency Issues #

  1. Multiple threads
  2. Shared memory

Permanent Storage #

Q. Why is long-term storage of information important?

A. Because main memory (RAM, Random Access Memory) is volatile: it allows fast access, but its contents are lost when power is lost. Important data must therefore be moved to persistent storage, such as a hard drive or SSD, which the operating system manages through the file system.
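Programs ask the OS to persist data through system calls such as open, write, and close. A minimal sketch (the message, the helper name `write_then_read`, and the path passed to it are my own examples, not from the book):

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

// Persist a message with raw system calls, then read it back.
// Returns 0 if the bytes on disk match what was written.
int write_then_read(const char *path) {
    const char *msg = "hello persistent world\n";

    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) return -1;
    write(fd, msg, strlen(msg));   // ask the OS to store the bytes
    close(fd);                     // the data now outlives this process

    char buf[64] = {0};
    fd = open(path, O_RDONLY);
    if (fd < 0) return -1;
    read(fd, buf, sizeof(buf) - 1);
    close(fd);
    return strcmp(buf, msg) == 0 ? 0 : -1;
}
```

Unlike the variable in the memory example, these bytes survive after the process exits, and even across a reboot.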