Picture a popular fast-food chain that also offers a “drive-thru” option. Customers line up in their cars to order, proceed to a pick-up point to receive their orders, and then leave. New customers join from the rear and exit in the order they arrive.
In Java, the Queue data structure works in a similar way — it follows the first-in-first-out (FIFO) principle. The Java Queue interface enables Java applications to hold ordered objects (or elements) while providing additional operations, including insertion, deletion, and inspection — all of which adhere to the FIFO rule.
The Queue interface also has robust implementing classes, including PriorityQueue, LinkedList, DelayQueue, ArrayBlockingQueue, and PriorityBlockingQueue.
In this guide, we’ll explore the most important aspects of the Queue interface, its core methods, detailed examples of queue operations, and thread-safety handling using BlockingQueues. Let’s dive in.
How to Use the Java Queue Interface
The Queue interface is a member of the java.util package and extends the Collection interface — the root interface of the Java Collections framework.
Since Collection has no direct implementation of its own, subinterfaces like List, Set, and Queue refine it. Consequently, the Queue contains its own methods in addition to those it inherits from the Collection interface.
Since the Queue is an interface, it can't be instantiated directly. Instead, you instantiate one of its implementing classes — the three main ones being LinkedList, PriorityQueue, and ArrayDeque.
Queue Interface: Common Methods
You can define the Queue interface and its signature methods as follows, where E is the element type contained in the queue:
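The declaration below is a simplified reproduction of java.util.Queue showing those signature methods (the real interface also inherits all of Collection's methods, omitted here for brevity):

```java
import java.util.Collection;

// Simplified reproduction of the java.util.Queue declaration.
interface Queue<E> extends Collection<E> {
    boolean add(E e);   // inserts; throws IllegalStateException if the queue is full
    boolean offer(E e); // inserts; returns false if the queue is full
    E remove();         // removes the head; throws NoSuchElementException if empty
    E poll();           // removes the head; returns null if empty
    E element();        // inspects the head; throws NoSuchElementException if empty
    E peek();           // inspects the head; returns null if empty
}
```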
Here are the key operations that the methods above perform on queues:
- element returns the queue’s head (leading) element. If the queue is empty, this method throws a NoSuchElementException.
- offer(E e) inserts a new element into the queue and returns a boolean indicating the outcome: true if the element was added, or false if the insertion would violate the queue’s capacity restrictions.
- peek also returns the head element of the queue without removing it. Unlike element, peek returns null rather than throwing an exception when the queue is empty.
- poll: a dual-operation method that retrieves and removes the queue’s head element in one step. If the queue is empty, this method returns null.
- remove: also removes and returns the head element of a queue but throws a NoSuchElementException if the data structure is empty.
- add(E e): a common method that adds an element to the queue only if the operation does not violate the structure’s capacity limit. The method returns true if the insertion succeeds and throws an IllegalStateException if the queue has no space to accommodate a new element.
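To make the exception-versus-null contrast concrete, here is a short sketch (the class name EmptyQueueDemo is illustrative) that calls each variant on an empty queue:

```java
import java.util.ArrayDeque;
import java.util.NoSuchElementException;
import java.util.Queue;

// Contrasts the exception-throwing methods (element, remove) with their
// null-returning counterparts (peek, poll) on an empty queue.
public class EmptyQueueDemo {
    public static void main(String[] args) {
        Queue<String> queue = new ArrayDeque<>();

        System.out.println(queue.peek()); // null — safe inspection
        System.out.println(queue.poll()); // null — safe removal

        try {
            queue.element();              // throws on an empty queue
        } catch (NoSuchElementException e) {
            System.out.println("element() threw NoSuchElementException");
        }

        try {
            queue.remove();               // also throws on an empty queue
        } catch (NoSuchElementException e) {
            System.out.println("remove() threw NoSuchElementException");
        }
    }
}
```

Preferring peek and poll over element and remove lets you test for emptiness and retrieve in one call instead of wrapping the operation in a try/catch.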
Queue Interface Implementation
Next, let’s cover two classes that implement the Queue interface, with examples: LinkedList and PriorityQueue.
The Java LinkedList class belongs to the collections framework and uses a doubly linked list to store its items. Instead of storing elements in contiguous memory locations, a LinkedList uses nodes to hold its data, with each node referencing its neighbors in the list. A node is characterized by three fields:
- Prev stores the previous element’s address within the data structure and is always null for the first element.
- Next holds the address to the next element in the list. For the last element in the list, this field is null.
- Data stores the element held by the node.
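As a rough illustration, such a node could be sketched as follows — note that LinkedList’s real Node class is a private implementation detail, so the names here are hypothetical:

```java
// Illustrative sketch of a doubly linked list node. java.util.LinkedList's
// actual Node class is private, so these field names are hypothetical.
class Node<E> {
    Node<E> prev; // previous node; null for the first element
    Node<E> next; // next node; null for the last element
    E data;       // the element held by this node

    Node(Node<E> prev, E data, Node<E> next) {
        this.prev = prev;
        this.data = data;
        this.next = next;
    }
}
```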
The example below creates a queue of the LinkedList type to hold even numbers between 0 and 10, using the methods explained earlier to demonstrate common operations on the data structure.
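A sketch of that example might look like this (the class name EvenNumberQueue is illustrative):

```java
import java.util.LinkedList;
import java.util.Queue;

// A LinkedList-backed queue holding the even numbers between 0 and 10,
// exercising the core Queue methods.
public class EvenNumberQueue {
    public static void main(String[] args) {
        Queue<Integer> evens = new LinkedList<>();

        for (int i = 0; i <= 10; i += 2) {
            evens.offer(i); // insert at the tail: 0, 2, 4, 6, 8, 10
        }

        System.out.println("Queue: " + evens);              // [0, 2, 4, 6, 8, 10]
        System.out.println("Head: " + evens.peek());        // 0 — inspect without removing
        System.out.println("Removed: " + evens.poll());     // 0 — retrieve and remove
        System.out.println("New head: " + evens.element()); // 2

        evens.remove(); // drop the current head (2)
        System.out.println("Queue now: " + evens);          // [4, 6, 8, 10]
    }
}
```

Note that the variable is declared as Queue<Integer> rather than LinkedList<Integer>, which keeps the code tied to the interface and makes swapping in another implementation trivial.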
A common use case for a doubly linked list is browser navigation: linking visited web resources together enables forward and backward movement through a browser window’s history.
Recall that a regular queue is a first-in-first-out (FIFO) structure — items are processed in the order they come in. However, there are cases when some processes within a queue have a higher priority than others.
Suppose that a group of individuals arrives for the annual charity marathon in your local town square and queue by their order of arrival (first-in). Before starting the race, the participants must check in by confirming their registration against a manual register with the race organizers.
However, the manual register lists participants in ascending order of their first names. Processing participants may require re-ordering the queue to match the naming criteria, resulting in a priority queue.
While the elements entering a priority queue are not necessarily sorted, they are always retrieved in sorted order. The PriorityQueue class accepts a Comparator that specifies the ordering. If none is provided, the queue applies natural ordering — alphabetical or numerical — to its elements.
The example below uses a priority queue to sort fruits added to a queue. The priority is ascending order of the fruits’ names.
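One possible version of that example (the class name FruitPriorityQueue is illustrative):

```java
import java.util.PriorityQueue;
import java.util.Queue;

// A PriorityQueue created without a Comparator, so elements are retrieved
// in natural (alphabetical) order regardless of insertion order.
public class FruitPriorityQueue {
    public static void main(String[] args) {
        Queue<String> fruits = new PriorityQueue<>();

        fruits.offer("Mango");
        fruits.offer("Apple");
        fruits.offer("Cherry");
        fruits.offer("Banana");

        // poll() always returns the smallest remaining element
        while (!fruits.isEmpty()) {
            System.out.println(fruits.poll()); // Apple, Banana, Cherry, Mango
        }
    }
}
```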
Note that in the example above, the PriorityQueue doesn’t have a comparator argument. Therefore, the queue orders the “fruit” elements alphabetically.
You can apply priority queues in operating system schedulers, where high-priority processes are served first even when lower-priority ones — typically sitting at the bottom of the queue — arrived earlier.
Queues and Thread Safety
A common issue affecting the LinkedList and PriorityQueue implementations is that they’re not thread-safe. While shared queues are a natural fit for multi-threaded processes, multiple threads reading from and writing to a single non-thread-safe queue can interleave their operations unpredictably, causing resource contention and even corrupting the queue’s internal state.
The BlockingQueue interface mitigates this problem by providing methods that force threads to defer operations based on the queue’s state. By implementing a blocking queue, write operations wait until the queue has available space while read operations wait for a non-empty queue before attempting element retrieval or removal.
Since BlockingQueue is an interface, it can’t be used directly; classes such as ArrayBlockingQueue and LinkedBlockingQueue implement it.
The BlockingQueue interface contains the typical Queue methods along with two additional blocking operations: put and take.
The put method supports multi-threaded processes by waiting until there is space in the array blocking queue before performing an element insertion. The take method waits until there are elements in the data structure before deleting or returning items from it.
BlockingQueue Implementation Using ArrayBlockingQueue
In this example, a BlockingQueue is implemented by the ArrayBlockingQueue class. The class contains a producer thread, which inserts even-numbered elements into the queue, and a consumer thread that retrieves from the same structure with a specified time delay.
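A minimal sketch of such a program follows; the class name BlockingQueueDemo, the capacity of 3, and the 200 ms consumer delay are all illustrative choices:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Producer-consumer demo: the producer inserts even numbers with put(),
// which blocks whenever the queue's capacity (3) is reached; the slower
// consumer retrieves them with take(), which blocks while the queue is empty.
public class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(3);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i <= 10; i += 2) {
                    queue.put(i); // waits for free space if the queue is full
                    System.out.println("Produced: " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 6; i++) {
                    Thread.sleep(200);        // simulate slow processing
                    int value = queue.take(); // waits for an element if empty
                    System.out.println("Consumed: " + value);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}
```

Because the consumer is slower than the producer, the producer fills the queue to capacity, blocks on put, and resumes only after the consumer frees a slot with take — all without any explicit locking in the application code.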
The program’s output demonstrates that the producer thread begins by adding an even number into the blocking queue, and the consumer thread only retrieves an element if it exists in the queue. The producer thread then resumes the insertion process because the array blocking queue has the space to accommodate new elements.
If you’re building highly concurrent systems, then BlockingQueue implementations are great for ensuring thread safety in producer-consumer processes such as event-streaming applications.
Implementing the Java Queue Interface
Using the Java Queue interface in your applications gives you robust data-handling structures that maintain element ordering by default and offer efficient data-manipulation methods through implementing classes such as LinkedList.
Even in cases where thread safety is critical, BlockingQueue implementations let you share a queue across threads safely, making the Queue family a strong fit for business cases that demand ordered, reliable data processing.