Rust and WebAssembly Serverless Functions in Vercel





Michael Yuan

Michael Yuan is the founder and maintainer of WasmEdge, an open-source WebAssembly runtime hosted by the CNCF for edge computing, service mesh, and embedded functions. Dr. Yuan is the author of six books on software engineering, and a long-time open source contributor. He is the co-founder of Second State, a venture-funded startup that supports and commercializes enterprise applications in the WebAssembly and Rust ecosystem.

Vercel is a leading platform for developing and hosting Jamstack applications. Unlike traditional web apps, where the UI is dynamically generated at runtime from a server, a Jamstack application consists of a static UI (in HTML and JavaScript) and a set of serverless functions to support dynamic UI elements via JavaScript.

There are many benefits to the Jamstack approach, but perhaps the most significant is performance. Since the UI is no longer generated at runtime by a central server, there is much less load on the server, and the UI can be deployed via edge networks such as CDNs.

However, the edge CDN only solves the problem of distributing the static UI files. The backend serverless functions could still be slow. In fact, popular serverless platforms have well-known performance issues, such as slow cold start, especially for interactive applications. That’s where WebAssembly could help.

With WasmEdge, a cloud native WebAssembly runtime hosted by the Cloud Native Computing Foundation, developers can write high-performance serverless functions deployed on the public cloud or on edge computing nodes. In this article, we will explore how to use WasmEdge functions, written in Rust, to power a Vercel application backend.

Why WebAssembly in Vercel Serverless?

The Vercel platform already has a very easy-to-use serverless framework for deploying functions hosted on Vercel. As we discussed above, the reason to use WebAssembly, and WasmEdge, is to further improve performance. High-performance functions written in C/C++, Rust, and Swift can easily be compiled into WebAssembly, and those WebAssembly functions are much faster than the JavaScript or Python code commonly used in serverless functions.

However, if raw performance is the only goal, why not just compile those functions to machine native executables? That is because the WebAssembly “container” still provides many valuable services.

For starters, WebAssembly isolates the function at runtime. Bugs or memory safety issues in the code will not propagate out of the WebAssembly runtime. As the software supply chain gets more complex over time, it is important to run code in containers to prevent unauthorized access to your data by dependency libraries.

Second, the WebAssembly bytecode is portable. Developers only need to build it once and do not need to worry about future changes or updates to the underlying Vercel serverless container (OS and hardware). It also allows developers to reuse the same WebAssembly functions in other hosting environments, such as another public cloud like Tencent Serverless Functions, or a data streaming framework like YoMo.


Finally, the WasmEdge Tensorflow API provides the most ergonomic way to execute Tensorflow models in the Rust programming language. WasmEdge installs the correct combination of Tensorflow dependency libraries, and provides a unified API for developers.

Enough with the concepts and explanations. Without further ado, let’s jump into the example apps!

Prerequisites

Since our demo WebAssembly functions are written in Rust, you will need a Rust compiler. Make sure that you install the wasm32-wasi compiler target as follows, in order to generate WebAssembly bytecode.
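If you manage your Rust toolchain with rustup (an assumption; any toolchain that provides the wasm32-wasi target will do), the target can be added with a single command:

$ rustup target add wasm32-wasi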

The demo application front end is written in Next.js and deployed on Vercel. We will assume that you already have a basic knowledge of how to work with Vercel.

Example 1: Image processing

Our first demo application allows users to upload an image and then invoke a serverless function to turn it into black and white. A live demo deployed on Vercel is available.


Fork the demo application’s GitHub repo to get started. To deploy the application on Vercel, just import the GitHub repo from the Vercel for GitHub web page.

This repo is a standard Next.js application for the Vercel platform. The backend serverless function is in the api/functions/image_grayscale folder. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN and then writes the black-and-white image to STDOUT.

You can use Rust’s cargo tool to build the Rust program into WebAssembly bytecode or native code.
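The commands below are a sketch of that build step, assuming the crate lives in the api/functions/image_grayscale folder mentioned above:

$ cd api/functions/image_grayscale
$ cargo build --release --target wasm32-wasi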

Copy the build artifacts to the api folder.
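For example, assuming the build above produced a grayscale.wasm artifact (the name is inferred from the grayscale.so file referenced below), the copy step looks roughly like this when run from the function’s folder:

$ cp target/wasm32-wasi/release/grayscale.wasm ../../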

 

Vercel runs api/pre.sh when setting up the serverless environment. It installs the WasmEdge runtime and then compiles each WebAssembly bytecode program into a native .so library for faster execution.
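As a rough sketch of what such a script does (the exact install command and AOT compiler invocation are assumptions based on the description above; see the pre.sh in the repo for the authoritative version):

# Install the WasmEdge runtime
$ curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash
# Ahead-of-time compile the WebAssembly bytecode into a native shared library
$ wasmedgec grayscale.wasm grayscale.so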

The api/hello.js file conforms to the Vercel serverless function specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Notice that api/hello.js runs the compiled grayscale.so file generated by api/pre.sh for better performance.

That’s it. Deploy the repo to Vercel and you now have a Vercel Jamstack app with a high-performance, Rust- and WebAssembly-based serverless backend.

Example 2: AI inference

The second demo application allows users to upload an image and then invoke a serverless function to classify the main subject in the image.


It is in the same GitHub repo as the previous example, but in the tensorflow branch. Note: when you import this GitHub repo on the Vercel website, Vercel creates a preview URL for each branch, so the tensorflow branch will have its own deployment URL.

The backend serverless function for image classification is in the api/functions/image-classification folder in the tensorflow branch. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN and then writes its text output to STDOUT. It utilizes the WasmEdge Tensorflow API to run the AI inference.

You can use the cargo tool to build the Rust program into WebAssembly bytecode or native code.
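The build mirrors the first example; as a sketch, assuming the folder layout described above:

$ cd api/functions/image-classification
$ cargo build --release --target wasm32-wasi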

Copy the build artifacts to the api folder.
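For example, assuming the build produced the classify.wasm artifact referenced below, run from the function’s folder:

$ cp target/wasm32-wasi/release/classify.wasm ../../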

Again, the api/pre.sh script installs the WasmEdge runtime and its Tensorflow dependencies in this application. It also compiles the classify.wasm bytecode program to the classify.so native shared library at deployment time.

The api/hello.js file conforms to the Vercel serverless function specification. It loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Notice that api/hello.js runs the compiled classify.so file generated by api/pre.sh for better performance.

You can now deploy your forked repo to Vercel and have a web app for food classification.

What’s Next

Running WasmEdge from Vercel’s current serverless container is an easy way to add high-performance functions to Vercel applications. If you have created an interesting Vercel function using WasmEdge, please let us know and we will send you a WasmEdge SWAG!

Going forward, an even better approach is to use WasmEdge as the container itself. There would then be no Docker and no Node.js needed to bootstrap WasmEdge, so we can reach much higher efficiency for running serverless functions. WasmEdge is already compatible with Docker tools. If you are interested in joining WasmEdge and the Cloud Native Computing Foundation for this exciting work, let us know!

InApps is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Docker.

Feature image via Pixabay.




