Serverless Cloud Pizzeria Shop

Darryl Ruggles

I have spent lots of time learning about various areas of software and cloud development so far in my career and wanted to work on a project that combined many of those components. In recent years most of my focus has been on backend technologies, but I wanted to get back to building some full-stack solutions. The full source for this project can be found on GitHub: https://github.com/RDarrylR/serverless-pizza-ordering

Recently we have all heard a lot about the company Momento and the products they offer, including Topics and Caches. These are truly serverless, scale-to-zero, pay-as-you-go products that provide high-speed caching and pub/sub messaging you can use in your projects.

I wanted to come up with an example project that could take advantage of these. What I am presenting today is a web-based ordering system (developed using React JS) that shows live updates on the progress of your pizza orders using Momento Topics. The backend for the project runs in AWS and uses various components including API Gateway, AWS Lambda, Step Functions, the Elastic Container Service (ECS) with Fargate compute, DynamoDB with streams, and more. It uses a mix of Python and Rust for the code.

A shared Momento Topic is used between the frontend app and the backend for each order. The frontend has a short-lived token that allows it to subscribe to the topic and receive updates, and the backend pushes updates to the topic as the order progresses. Powerful AI actually creates the pizzas and delivers them to the customers via some containers running in the ECS cluster. Of course we could have used AWS Lambda functions for the actual making of the pizzas and the delivery, but the AI tells me there are some rare cases where it could take more than 15 minutes, and that it prefers working with containers rather than some Firecracker micro-VM when pizza is involved.
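
To make the pub/sub flow concrete, below is a minimal sketch of the publish side using the Momento Python SDK. In this project the status updates are actually published from the state machine via the Topics HTTP API (covered later), so treat the cache name and topic prefix here as illustrative assumptions; the "State" field matches what the frontend parses.

# Illustrative sketch of publishing an order-status update to a per-order
# Momento Topic. The cache name and topic prefix are assumptions for this
# example, not the repo's exact values.
import json

from momento import CredentialProvider, TopicClient, TopicConfigurations
from momento.responses import TopicPublish

topic_client = TopicClient(
    TopicConfigurations.Default.latest(),
    CredentialProvider.from_environment_variable("MOMENTO_API_KEY"),
)

def publish_status(order_id: str, state: str) -> None:
    response = topic_client.publish(
        "pizza-cache",              # cache that scopes the topic (assumed name)
        f"pizza-order-{order_id}",  # one topic per order (assumed prefix)
        json.dumps({"State": state}),
    )
    match response:
        case TopicPublish.Success():
            pass  # any subscribers (like the frontend) get the update immediately
        case TopicPublish.Error() as error:
            raise RuntimeError(f"Publish failed: {error.message}")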

The Infrastructure as Code (IaC) tool I used for this project is Terraform. I like to use the Serverless Application Model (SAM) as well for my projects, but Terraform does a great job when there are a lot of components to set up. In this project we are creating a VPC, subnets, ECS clusters, Step Functions, an API Gateway, and much more. Terraform handles all of this cleanly with a nice set of config files.

Frontend

The front end consists of a typical online storefront (coded with React JS) with a list of products - which (in the Cloud Pizzeria) are one-size-fits-all classic pizzas. For some reason we did include Hawaiian pizza, but that must have been a glitch in the system. Please note that you cannot customize the toppings on each pizza, as the AI is very fussy, claiming each of its creations is a classic and you can pick things off when you get it if you're really that concerned. You can choose between super-fast AI-based delivery and picking up the pizzas yourself.

The store allows you to see the descriptions of the pizzas, add them to a cart, and so on. Before you can place an order you need to fill in your profile information, including name, address, and phone number - after all, the AI needs to know where to deliver the pizzas or who to call if you don't show up.

Here are some pics of the store:

Product list

Product Details

Once you have chosen your products and filled in a profile, you can go to the cart and complete your order. Make sure to pick your order type of Delivery or Pickup. After this you can sit back and watch the progress as the powerful AI makes your pizza and delivers it to your door (or at least tells you it's done if you chose to pick it up).

Checkout page

After you complete the purchase in the frontend, you are presented with the status of your order. As the order progresses you will see live updates with timestamps. Once your order is ready, close the status page and get ready to order again.

Order started

Order Progress

Order is Ready

The key to the front end getting these updates is that it subscribes to the Momento Topic whose ID the backend passed back to it. Once it's listening for these updates it can update the progress as the images above show. Below is a snippet of the code to set this up.

// Imports from the Momento web SDK (the @gomomento/sdk-web package)
import { TopicClient, TopicConfigurations, CredentialProvider } from '@gomomento/sdk-web';

// Build a topic client using the short-lived, order-specific token
// returned by the backend when the order was created
const topicClient = new TopicClient({
  configuration: TopicConfigurations.Browser.latest(),
  credentialProvider: CredentialProvider.fromString({
    apiKey: data.token
  })
})

...


// Subscribe to the order's topic and forward each status update to the UI
await topicClient.subscribe(process.env.REACT_APP_MOMENTTO_CACHE, topic, {
  onItem: (item => {
    console.log('Received item:', item.value());
    try {
      const status = JSON.parse(item.value());
      console.log("status=", status.State)
      onOrderStatusUpdate(status);
    } catch (error) {
      console.error('Error parsing status update:', error);
      onOrderStatusUpdate({ message: item.value() });
    }
  }),
  onError: (error) => {
    alert(`Error subscribing to Momento topic: ${error.message}`)
  }
})

Backend design

The backend of this project is driven by API Gateway and AWS Step Functions. The API used to create orders is set up in API Gateway and is built from an OpenAPI spec file, as setting it up piece by piece in Terraform is rather ugly. The create order call to API Gateway is backed by an AWS Lambda function coded in Python. This function creates an entry for the order in a DynamoDB table, generates a temporary token using the Momento Auth Client that is scoped to the current order, and sends it back to the client in the response. Below is a snippet of the Lambda handler.

# Standard-library imports used by this snippet (the Powertools tracer/logger/
# metrics, DynamoDB table, and Momento auth client setup are omitted for brevity)
import json
import uuid
from datetime import UTC, datetime
from decimal import Decimal
from typing import Any, Dict

@tracer.capture_lambda_handler
@logger.inject_lambda_context(log_event=True)
@metrics.log_metrics(capture_cold_start_metric=True)
def lambda_handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
    try:
        body: Dict[str, Any] = json.loads(event['body'], parse_float=Decimal)
        order_id: str = str(uuid.uuid4())
        timestamp: str = datetime.now(UTC).isoformat()

        item: Dict[str, Any] = {
            'orderId': order_id,
            'timestamp': timestamp,
            'status': 'PENDING',
            'items': body.get('items', []),
            'customer': body.get('customer', {}),
            'orderType': body.get('orderType', ''),
            'totalAmount': body.get('totalAmount', 0)
        }  

        table.put_item(Item=item)

        # Create an auth token so the user can track their order using the momento topic for the order
        momento_response = momento_auth_client.generate_disposable_token(
                    DisposableTokenScopes.topic_subscribe_only(momento_cache_name, f"{momento_topic_prefix}{order_id}"),
                    ExpiresIn.minutes(60))

        match momento_response:
            case GenerateDisposableToken.Success():
                logger.info("Successfully generated a disposable token", 
                            extra={
                                "auth_token": momento_response.auth_token,
                                "endpoint": momento_response.endpoint
                            })
            case GenerateDisposableToken.Error() as error:
                logger.error("Error generating a disposable token",
                             extra={"error": error.message})

        return {
            'statusCode': 201,
            'headers': {
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Headers': 'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token',
                'Access-Control-Allow-Methods': 'OPTIONS,POST,GET'
            },
            'body': json.dumps(
                {
                    'orderId': order_id, 
                    'message': 'Order created successfully',
                    'token': momento_response.auth_token if hasattr(momento_response, 'auth_token') else None
                })
        }
    except Exception as e:
        logger.exception("Error creating order")
        return {'statusCode': 500, 'body': json.dumps({'error': str(e)})}

The AWS Lambda function does not initiate the state machine that drives the order processing. I decided to go for a more event-driven approach and wanted to use DynamoDB streams. When the new order from the function above is inserted into the DynamoDB table, the stream set up on the table will emit an event. The process_dynamodb_stream Lambda function - which has an Event Source Mapping (ESM) on this table's stream - will be executed based on updates to the DynamoDB table.

The process_dynamodb_stream function will parse the event, and if it's a new order in the PENDING state, it will initiate an execution of the Process-Pizza_Order-State-Machine Step Function for the new order. Below is a snippet of the process_dynamodb_stream function.

@tracer.capture_lambda_handler
@logger.inject_lambda_context(log_event=True)
@metrics.log_metrics(capture_cold_start_metric=True)
def lambda_handler(event: Dict[str, Any], context: LambdaContext) -> Dict[str, Any]:
    try:
        process_partial_response(event=event, record_handler=record_handler, processor=processor, context=context)
        return {"statusCode": 200, "body": json.dumps({"message": "Stream processing completed successfully"})}

    except Exception as e:
        logger.exception("Error processing DynamoDB stream")
        return {"statusCode": 500, "body": json.dumps({"error": str(e)})}

def process_new_order(new_image: Dict[str, Any]) -> None:
    logger.info(f"Processing new order with new image: {new_image}")
    order_id = new_image.get("orderId", "")
    status = new_image.get("status", "")

    if status == "PENDING":
        try:
            # start_execution returns a response dict containing the executionArn
            response = stepfunctions.start_execution(
                stateMachineArn=os.environ['STATE_MACHINE_ARN'],
                input=json.dumps(
                    {
                        "orderId": order_id,
                        "orderType": new_image.get("orderType", ""),
                        "ordersTableName": os.environ['ORDERS_TABLE'],
                        "customer": new_image.get("customer", {})
                    }
                )
            )
            logger.info(f"Started state machine execution for order={order_id}, execution_arn={response['executionArn']}")
        except Exception as e:
            logger.error(f"Failed to start state machine for order {order_id}: {str(e)}")

The AWS Step Function is the main driver of the workflow to process the order. Step Functions can call almost any AWS service or any type of external API. In this example I am using the HTTP Task type in Step Functions to call the Momento Topics HTTP API to update the status of the order being processed. The actual work of making the pizza and delivering it is done by advanced AI, which is controlled using containers running in ECS with Fargate compute. The state machine passes a Task Token to the ECS Tasks, which have to return it when they are done processing their work. Inside the ECS Tasks, the status of the order gets updated in the DynamoDB table. The state machine also has to know whether to send the AI to deliver the order or whether the customer is picking it up. Below is a diagram of the state machine and one showing an example of a completed order.

State Machine

Order workflow execution

Container Code

The code needed to control the AI to actually make the pizzas and deliver them is super complex, and they insisted I use Rust for it since it is the only thing the AI is told it can trust. Unfortunately, I had to remove the core of that code before I checked it into GitHub, but I left a skeleton of it. It still shows the Step Functions token handling and the DynamoDB updates.
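
For consistency with the other snippets, here is that token handshake sketched in Python with boto3 (the real container code is Rust, and the function names and output fields here are illustrative):

# Illustrative sketch of the Task Token callback pattern the ECS tasks use.
# When a task finishes its step, it reports success or failure back to
# Step Functions so the .waitForTaskToken state can complete.
import json

import boto3

sfn = boto3.client("stepfunctions")

def report_step_done(task_token: str, order_id: str, status: str) -> None:
    sfn.send_task_success(
        taskToken=task_token,
        output=json.dumps({"orderId": order_id, "status": status}),
    )

def report_step_failed(task_token: str, reason: str) -> None:
    sfn.send_task_failure(
        taskToken=task_token,
        error="PizzaProcessingError",  # illustrative error code
        cause=reason,
    )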

Try the example in your AWS account

You can clone the GitHub repo and try this out in your own AWS account. The README.md file describes any changes you need to make to get it working.

Please let me know if you have any suggestions or problems trying out this example project.

For more articles from me please visit my blog at Darryl's World of Cloud or find me on X, LinkedIn, Medium, Dev.to, or the AWS Community.

For tons of great serverless content and discussions please join the Believe In Serverless community we have put together at this link: Believe In Serverless Community