## 232. Implement Queue using Stacks

Implement the following operations of a queue using stacks.

• push(x) -- Push element x to the back of queue.
• pop() -- Removes the element from in front of queue.
• peek() -- Get the front element.
• empty() -- Return whether the queue is empty.
Notes:
• You must use only standard operations of a stack -- which means only `push to top`, `peek/pop from top`, `size`, and `is empty` operations are valid.
• Depending on your language, stack may not be supported natively. You may simulate a stack by using a list or deque (double-ended queue), as long as you use only standard operations of a stack.
• You may assume that all operations are valid (for example, no pop or peek operations will be called on an empty queue).

## Solution

A queue is a FIFO (first in, first out) data structure: elements are inserted at one end, the `rear`, and removed from the other, the `front`. The most intuitive way to implement it is with a linked list, but this article introduces another approach using stacks. A stack is a LIFO (last in, first out) data structure: elements are added and removed from the same end, called the `top`. To satisfy the FIFO property of a queue we need to keep two stacks. They serve to reverse the arrival order of the elements, and one of them stores the queue elements in their final order.

#### Approach #1 (Two Stacks) Push - O(n) per operation, Pop - O(1) per operation.

Algorithm

Push

A queue is FIFO (first-in-first-out) but a stack is LIFO (last-in-first-out). This means the newest element must be pushed to the bottom of the stack. To do so, we first transfer all `s1` elements to an auxiliary stack `s2`. Then the newly arrived element is pushed on top of `s2`, and all elements are popped from `s2` and pushed back to `s1`.

Figure 1. Push an element in queue

Java

```java
private int front; // cached front element; s1 and s2 are the two stack fields

public void push(int x) {
    if (s1.empty())
        front = x;
    while (!s1.isEmpty())
        s2.push(s1.pop());
    s2.push(x);
    while (!s2.isEmpty())
        s1.push(s2.pop());
}
```

Complexity Analysis

• Time complexity : O(n).

Each element, with the exception of the newly arrived, is pushed and popped twice. The last inserted element is popped and pushed once. Therefore this gives 4n + 2 operations, where n is the queue size. The `push` and `pop` operations have O(n) and O(1) time complexity respectively.

• Space complexity : O(n).

We need additional memory to store the queue elements.

Pop

The algorithm pops an element from stack `s1`, because `s1` always stores the first-inserted queue element on its top. The front element of the queue is cached in `front`.

Figure 2. Pop an element from queue

Java

```java
// Removes the element from the front of queue.
public void pop() {
    s1.pop();
    if (!s1.empty())
        front = s1.peek();
}
```

Complexity Analysis

• Time complexity : O(1).

• Space complexity : O(1).

Empty

Stack `s1` contains all the queue elements, so the algorithm checks whether `s1` is empty to decide whether the queue is empty.

```java
// Return whether the queue is empty.
public boolean empty() {
    return s1.isEmpty();
}
```

Time complexity : O(1).

Space complexity : O(1).

Peek

The `front` element is kept in constant memory and is updated whenever we push or pop an element.

```java
// Get the front element.
public int peek() {
    return front;
}
```

Time complexity : O(1).
The `front` element has been calculated in advance and is simply returned by the `peek` operation.

Space complexity : O(1).
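For reference, the snippets above assemble into a single class as follows. This is a sketch assuming `java.util.Stack`; the `Demo` driver class and its sample values are illustrative additions, not part of the original solution:

```java
import java.util.Stack;

class MyQueue {
    private Stack<Integer> s1 = new Stack<>();
    private Stack<Integer> s2 = new Stack<>();
    private int front; // cached front element of the queue

    // Push element x to the back of queue: O(n) per operation.
    public void push(int x) {
        if (s1.empty())
            front = x;
        while (!s1.isEmpty())
            s2.push(s1.pop());
        s2.push(x);
        while (!s2.isEmpty())
            s1.push(s2.pop());
    }

    // Removes the element from the front of queue: O(1) per operation.
    public void pop() {
        s1.pop();
        if (!s1.empty())
            front = s1.peek();
    }

    // Get the front element: O(1).
    public int peek() {
        return front;
    }

    // Return whether the queue is empty: O(1).
    public boolean empty() {
        return s1.isEmpty();
    }
}

public class Demo {
    public static void main(String[] args) {
        MyQueue q = new MyQueue();
        q.push(1);
        q.push(2);
        System.out.println(q.peek());  // prints 1
        q.pop();
        System.out.println(q.peek());  // prints 2
        System.out.println(q.empty()); // prints false
    }
}
```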

#### Approach #2 (Two Stacks) Push - O(1) per operation, Pop - Amortized O(1) per operation.

Algorithm

Push

The newly arrived element is always added on top of stack `s1`, and the first-inserted element is cached as the `front` queue element.

Figure 3. Push an element in queue

Java

```java
private Stack<Integer> s1 = new Stack<>();
private Stack<Integer> s2 = new Stack<>();

// Push element x to the back of queue.
public void push(int x) {
    if (s1.empty())
        front = x;
    s1.push(x);
}
```

Complexity Analysis

• Time complexity : O(1).

Appending an element to a stack is an O(1) operation.

• Space complexity : O(n).

We need additional memory to store the queue elements.

Pop

We have to remove the element at the front of the queue. This is the first element inserted into stack `s1`, and because of the stack's LIFO (last in, first out) policy it sits at the bottom of the stack. To remove the bottom element of `s1`, we pop all elements from `s1` and push them onto an auxiliary stack `s2`, which stores the elements of `s1` in reversed order. This way the bottom element of `s1` ends up on top of `s2`, and we can simply pop it from `s2`. Once `s2` is empty, the algorithm transfers data from `s1` to `s2` again.

Figure 4. Pop an element from stack

Java

```java
// Removes the element from in front of queue.
public void pop() {
    if (s2.isEmpty()) {
        while (!s1.isEmpty())
            s2.push(s1.pop());
    }
    s2.pop();
}
```

Complexity Analysis

• Time complexity: Amortized O(1), worst-case O(n).

In the worst-case scenario, when stack `s2` is empty, the algorithm pops n elements from stack `s1` and pushes n elements to `s2`, where n is the queue size. This gives 2n operations, which is O(n). But when stack `s2` is not empty, the algorithm runs in O(1) time. So what does amortized O(1) mean? Please see the next section on amortized analysis for more information.

• Space complexity : O(1).

Amortized Analysis

Amortized analysis gives the average performance (over time) of each operation in the worst case. The basic idea is that a worst-case operation can alter the state in such a way that the worst case cannot occur again for a long time, thus amortizing its cost.

Consider this example where we start with an empty queue and apply the following sequence of operations:

push_1, push_2, …, push_n, pop_1, pop_2, …, pop_n

The worst-case time complexity of a single pop operation is O(n). Since we have n pop operations, using the worst-case per-operation analysis gives us a total of O(n²) time.

However, in a sequence of operations the worst case does not occur in every operation: some operations may be cheap, some may be expensive. Therefore a traditional worst-case per-operation analysis can give an overly pessimistic bound. For example, in a dynamic array only some inserts take linear time, while others take constant time.

In the example above, the number of times the pop operation can be called is limited by the number of push operations before it. Although a single pop operation can be expensive, it is expensive only once per `n` operations (the queue size), when `s2` is empty and data must be transferred from `s1` to `s2`. Hence the total cost of the sequence is `n` (for the push operations) + `2*n` (for the first pop operation) + `n - 1` (for the remaining pop operations), which is O(n). Averaged over the 2n operations in the sequence, this gives O(1) time per operation.
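The arithmetic above can be checked empirically. The sketch below (an instrumented counter written for this article, not part of the original solution) performs n queue pushes followed by n queue pops using the Approach #2 transfer rule, counting every individual stack push and pop. The total comes to 4n stack operations for 2n queue operations, i.e. a constant number of stack operations per queue operation on average:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class AmortizedCount {
    // Count individual stack operations for n queue pushes followed by n queue pops.
    static long countStackOps(int n) {
        Deque<Integer> s1 = new ArrayDeque<>();
        Deque<Integer> s2 = new ArrayDeque<>();
        long ops = 0;

        // n queue pushes: one stack push each -> n operations.
        for (int i = 0; i < n; i++) {
            s1.push(i);
            ops++;
        }
        // n queue pops: the transfer runs only when s2 is empty.
        for (int i = 0; i < n; i++) {
            if (s2.isEmpty()) {
                while (!s1.isEmpty()) {
                    s2.push(s1.pop());
                    ops += 2; // one pop from s1 plus one push to s2
                }
            }
            s2.pop();
            ops++;
        }
        // Total: n (pushes) + 2n (one full transfer) + n (pops) = 4n.
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countStackOps(1000)); // prints 4000
    }
}
```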

\n

Empty

Stacks `s1` and `s2` together contain all the queue elements, so the algorithm checks whether both are empty to decide whether the queue is empty.

```java
// Return whether the queue is empty.
public boolean empty() {
    return s1.isEmpty() && s2.isEmpty();
}
```

Time complexity : O(1).

Space complexity : O(1).

Peek

The `front` element is kept in constant memory and is updated when we push an element. When `s2` is not empty, the front element is positioned on the top of `s2`.

```java
// Get the front element.
public int peek() {
    if (!s2.isEmpty()) {
        return s2.peek();
    }
    return front;
}
```

Time complexity : O(1).

The `front` element was either calculated in advance or is the top element of stack `s2`; in both cases it is returned in constant time.

Space complexity : O(1).
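Putting Approach #2 together, the snippets assemble into the following class. This is a sketch assuming `java.util.Stack`; the `Demo2` driver and its sample values are illustrative additions, not part of the original solution:

```java
import java.util.Stack;

class MyQueue {
    private Stack<Integer> s1 = new Stack<>();
    private Stack<Integer> s2 = new Stack<>();
    private int front; // first element pushed while s1 was empty

    // Push element x to the back of queue: O(1).
    public void push(int x) {
        if (s1.empty())
            front = x;
        s1.push(x);
    }

    // Removes the element from in front of queue: amortized O(1).
    public void pop() {
        if (s2.isEmpty()) {
            while (!s1.isEmpty())
                s2.push(s1.pop());
        }
        s2.pop();
    }

    // Get the front element: O(1).
    public int peek() {
        if (!s2.isEmpty())
            return s2.peek();
        return front;
    }

    // Return whether the queue is empty: O(1).
    public boolean empty() {
        return s1.isEmpty() && s2.isEmpty();
    }
}

public class Demo2 {
    public static void main(String[] args) {
        MyQueue q = new MyQueue();
        q.push(1);
        q.push(2);
        q.push(3);
        System.out.println(q.peek()); // prints 1
        q.pop();                      // triggers the s1 -> s2 transfer
        System.out.println(q.peek()); // prints 2
    }
}
```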

Analysis written by: @elmirap.
