Multithreading: A single core can switch between different threads rapidly, creating the illusion of simultaneous execution (concurrency). This is achieved through time slicing, where the core allocates small time slots to each thread. This is useful for improving responsiveness (e.g., keeping a UI from freezing) and handling I/O-bound tasks (waiting for disk or network operations), as one thread can wait while another executes.
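As a minimal sketch of this idea in Python (the function name `fetch` and the sleep delays are illustrative stand-ins for real I/O such as network requests):

```python
import threading
import time

def fetch(name, delay):
    # Simulate an I/O-bound operation (e.g., a network request) by sleeping;
    # while this thread waits, the core is free to run another thread.
    time.sleep(delay)
    print(f"{name} finished after {delay}s")

# Start two threads; even on a single core they interleave via time slicing,
# so the total wall time is roughly max(delays), not their sum.
threads = [
    threading.Thread(target=fetch, args=("download-A", 1.0)),
    threading.Thread(target=fetch, args=("download-B", 1.0)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```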
Parallel Processing: This involves using multiple cores (or even multiple processors) to truly execute different parts of a program at the same time. This is essential for CPU-bound tasks that can be broken down into independent subtasks, significantly reducing overall execution time.
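A rough sketch of the same idea, assuming a CPU-bound workload that splits into independent chunks (`count_primes` and the chosen limits are hypothetical examples; in Python, `ProcessPoolExecutor` is used here so each chunk can run on a separate core):

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    # CPU-bound subtask: count primes below `limit` by trial division.
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Each chunk runs in its own worker process, so independent subtasks
    # can execute simultaneously on multiple cores, cutting wall-clock time.
    limits = [50_000, 50_000, 50_000, 50_000]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, limits))
    print(results)
```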
A) This gets the relationship backwards. Both multithreading and parallel processing typically operate within a single process. Distributing tasks across multiple processes is multiprocessing, a different concept.
B) Multithreading does not ensure sequential execution; it allows for concurrent execution. Parallel processing enables simultaneous execution.
C) This is also incorrect. Multithreading is often used to improve the performance and responsiveness of I/O-bound tasks, while parallel processing is what speeds up CPU-bound tasks.