The Dual Forces of Concurrency: Volatile & Synchronized

Ankit Wasankar
2 min read · Jun 30, 2023

In multithreading and concurrent programming, two commonly used terms that often confuse developers are “synchronized” and “volatile.” While they might appear to be similar in nature, they serve distinct purposes in ensuring thread safety (synchronized) and memory visibility (volatile).

In this article, we will shed light on the differences between synchronized and volatile keywords and understand their unique roles in concurrent programming.



Volatile works with visibility:

When a processor fetches a value from RAM, it typically stores it in its own cache for faster access. This caching behavior, however, can lead to a problem known as “visibility” in concurrent programming: when one processor updates a value, other processors may keep reading a stale copy from their own caches, causing inconsistencies and subtle bugs.

This is where the volatile keyword comes in. When a variable is declared as volatile, it ensures that the most up-to-date value of that variable is always directly read from and written to the main memory, bypassing the local caches.

This guarantees that all processors see the latest value, avoiding visibility issues due to caching and ensuring consistency across threads.
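The classic symptom of a visibility problem is a worker thread that spins forever on a stale flag. A minimal sketch (the class and field names here are illustrative, not from the article) shows how declaring the flag volatile guarantees the worker observes the write made by the main thread:

```java
public class VolatileFlagDemo {
    // volatile forces every read and write of this field to go through main memory,
    // so the worker thread cannot keep looping on a stale cached copy
    private static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (running) {
                // busy-wait until the main thread clears the flag
            }
            System.out.println("Worker observed running = false, exiting.");
        });
        worker.start();

        Thread.sleep(100);   // give the worker time to enter its loop
        running = false;     // visible to the worker because the field is volatile
        worker.join(5000);   // with volatile, the worker terminates promptly
    }
}
```

Without the volatile modifier, the JIT compiler is free to hoist the read of `running` out of the loop, and the worker may never terminate.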

Synchronized works with locks:

When a block of code or a method is declared as synchronized, the JVM ensures that only one thread can execute it at a time.

Under the hood, synchronized uses a lock to achieve mutual exclusion. When a thread encounters a synchronized block or method, it attempts to acquire the lock associated with that block or method. If the lock is available, the thread proceeds to execute the synchronized code. If the lock is already held by another thread, the current thread enters a waiting state until the lock becomes available.

Once a thread completes executing the synchronized block, it releases the lock, allowing other waiting threads to acquire it. This ensures that only one thread can execute the synchronized code at any given time, preventing race conditions and maintaining thread safety.
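A short sketch of mutual exclusion in practice (class and counts chosen for illustration): several threads hammer a shared counter, and the synchronized methods serialize the read-modify-write so no increments are lost.

```java
public class SynchronizedCounter {
    private int count = 0;

    // Only one thread at a time can hold this instance's lock,
    // so the read-modify-write below cannot interleave with another thread's.
    public synchronized void increment() {
        count++;
    }

    public synchronized int get() {
        return count;
    }

    public static void main(String[] args) throws InterruptedException {
        SynchronizedCounter counter = new SynchronizedCounter();
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> {
                for (int j = 0; j < 10_000; j++) counter.increment();
            });
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        System.out.println(counter.get()); // 40000: no increments lost
    }
}
```

If `increment()` were not synchronized, two threads could read the same value of `count`, both add one, and both write back the same result, silently losing an update.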

Note: When a thread releases a lock, it flushes any modifications made to shared variables to main memory, ensuring that the changes are visible to other threads. Similarly, when a thread acquires a lock, it fetches the most up-to-date values of shared variables from main memory into its local cache. So every variable accessed inside a synchronized block effectively behaves as if it were volatile, and threads will always see the latest values.
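The note above is worth seeing in code. In this sketch (names are illustrative), the writer publishes a plain, non-volatile field while holding a lock, and the reader acquires the same lock before reading; the release/acquire pair is what makes the write visible:

```java
public class LockVisibilityDemo {
    private final Object lock = new Object();
    private int payload;        // intentionally NOT volatile
    private boolean published;  // intentionally NOT volatile

    public void publish(int value) {
        synchronized (lock) {   // acquire the lock
            payload = value;
            published = true;
        }                       // release: writes are flushed to main memory
    }

    public int awaitValue() throws InterruptedException {
        while (true) {
            synchronized (lock) { // acquire: reads are refreshed from main memory
                if (published) return payload;
            }
            Thread.sleep(1);      // back off briefly before retrying
        }
    }
}
```

Note that both threads must synchronize on the *same* lock; locking different objects establishes no ordering between them, and the reader could still see stale values.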


Volatile is used to guarantee visibility, ensuring that the most recent value of a variable is seen by all threads.

Synchronized, on the other hand, provides mutual exclusion, allowing only one thread to access a synchronized block at a time, ensuring thread safety and controlling the order of execution.
