The posts tagged with 'Serverless & Schemaless' are listed below. You can get to each post by clicking the title in the list.


Querying Azure Cosmos DB using serverless Node.js

A few days ago I blogged about using Functions, .NET, and Cosmos DB's Graph API together. As I pointed out in an update to that post, working with Cosmos DB's Graph API was exciting and offered some interesting opportunities and mental exercises. The first [somewhat constructive] criticism I received on the initial post was from my friend and colleague Burke Holland, who asked "where's the Node.js love?," to which I replied "I wanted to use the VS Functions tools and I'm more comfortable with .NET and..." Burke cut me off mid-message with clear judgment and purpose:

"Nobody puts Node in the corner."

I can't ignore a strong argument that conjures that gem of filmography and choreography, so the second post in the series will give Node.js a break from the battlefield, escorting it right back onto the dancefloor. I'll show you how to build a quick-and-dirty serverless Azure Function that uses the open-source gremlin-secure NPM package to query the same Cosmos DB Graph database from the original post and the docs.microsoft.com article Azure Cosmos DB's Graph API .NET Client. The result will be an HTTP API we can call when we want a list of all the people who know one specific person.
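Concretely, the traversal behind that API is a Gremlin string built around the incoming name. Here's a minimal sketch of how the query gets composed; the buildKnowsQuery helper is illustrative only, since the Function shown later inlines the concatenation:

```javascript
// Build the Gremlin traversal that finds the 'knows' edges pointing
// at the person with the given id. The helper name is illustrative;
// the Function itself inlines this string concatenation.
function buildKnowsQuery(name) {
    return "g.V().outE('knows').where(inV().has('id','" + name + "'))";
}

console.log(buildKnowsQuery("thomas"));
// g.V().outE('knows').where(inV().has('id','thomas'))
```

Note that the name is concatenated straight into the query string, so in anything beyond a demo you'd want to sanitize it before sending it to the server.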

Part 2 of the Serverless & Schemaless Series

Functions and Cosmos DB's nature together create a Serverless & Schemaless Scenario, and the opportunities this scenario provides for agile developers dealing with evolving structures and processes seem vast. This post is one in what I call the Serverless & Schemaless Series:

  1. Querying Azure Cosmos DB's Graph API using an Azure Function - Walks through creating a .NET Function that queries Cosmos DB's Graph API using the Cosmos DB Graph API .NET SDK.
  2. Querying Azure Cosmos DB using serverless Node.js (this post)
  3. TBD
  4. TBD

Creating an Azure Function with Node.js

The Azure Functions JavaScript developer guide has a wealth of information on some of the internal Function SDKs, as well as how to do things like reading environment variables into your code that you configure in the Azure portal, and more. This is definitely recommended reading for more context on some of the lower-level details we won't cover in this post.

Let's dive in and create a new Azure Function, using JavaScript as the language du jour and the HttpTrigger template as a getting-started point.

Create a new trigger

Once the new Function has been created and deployed into your Azure subscription and resource group, the Functions editing experience will open in the portal. Note a few key areas of the window:

  1. The code editor, where we can make updates to our serverless code.
  2. The Test area on the right side of the code editor. This dialog is useful for executing test requests against an Azure Function. The HttpTrigger template expects a name parameter, so the Request body widget provides a quick way of editing the request content.
  3. The View Files tab. It provides quick access to the *.js, *.json, and other source code files that comprise a Function.

The new Function in the editor

The code below is also available in my fork of the original code used in the article that got me started. Copy this code and paste it into the code editor in the Azure portal. Portions of the source code below were borrowed (okay, stolen and bastardized) from the Cosmos DB team's great code sample on Getting Started with Node.js and the Graph API.

var Gremlin = require("gremlin-secure");

// read the configuration from the environment settings
const config = {
    "database" : process.env["database"],
    "collection" : process.env["collection"],
    "AuthKey" : process.env["AuthKey"],
    "nodeEndpoint" : process.env["nodeEndpoint"]
};

// handle the http request
module.exports = function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    if (req.query.name || (req.body && req.body.name)) {

        // create a new client
        const client = Gremlin.createClient(443, 
            config.nodeEndpoint,
            {
                "session":false,
                "ssl":true,
                "user":"/dbs/" + config.database + "/colls/" + config.collection,
                "password": config.AuthKey
            }
        );

        // execute the query
        client.execute("g.V().outE('knows').where(inV().has('id','" + (req.query.name || req.body.name) + "'))", { }, (err, results) => {
            if (err) {
                context.log(err);

                // surface the failure to the caller rather than leaving the request hanging
                context.res = {
                    status: 500,
                    body: err.message
                };

                return context.done();
            }

            context.log(results);

            // return the results
            context.res = {
                // status: 200, /* Defaults to 200 */
                body: results
            };

            context.done();
        });
    }
    else {
        context.res = {
            status: 400,
            body: "Please pass a name on the query string or in the request body"
        };

        context.done();
    }
};

Once you paste in the code, the editor should look like this.

The Function code pasted into the portal editor

Note in particular these lines from the source code of the Node.js Function. They read the Function's configuration from environment variables, which can be set using the Azure portal.

// read the configuration from the environment settings
const config = {
    "database" : process.env["database"],
    "collection" : process.env["collection"],
    "AuthKey" : process.env["AuthKey"],
    "nodeEndpoint" : process.env["nodeEndpoint"]
};

This way you can keep configuration values out of your source code and your source control repository. To set up these variables, go to the Application Settings blade of your Function (or any other App Service); you can reach it from the Platform features blade in your Function app.

Getting to Application Settings

Each of the Application Settings is available to the Node.js code as an environment variable, accessible using the process.env[name] pattern. Note the AuthKey setting - it's the same key that's used by the .NET code from the first post in this series. Since the .NET Function and the Node.js Function run in the same App Service instance, both can share these Application Settings.
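Since a missing Application Setting simply shows up as undefined at runtime, it can be worth failing fast at startup. Here's a small sketch of that idea using the same setting names as the code above; the readConfig helper is hypothetical and not part of the Function's actual code:

```javascript
// Read the Function's Application Settings from environment variables
// and fail fast if any required setting is missing.
const required = ["database", "collection", "AuthKey", "nodeEndpoint"];

function readConfig(env) {
    const missing = required.filter((name) => !env[name]);
    if (missing.length > 0) {
        throw new Error("Missing Application Settings: " + missing.join(", "));
    }
    return {
        database: env["database"],
        collection: env["collection"],
        AuthKey: env["AuthKey"],
        nodeEndpoint: env["nodeEndpoint"]
    };
}

// Inside the Function you'd call readConfig(process.env).
```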

That said, the Gremlin NPM package for Node.js expects a slightly different URL structure than the .NET client, so the nodeEndpoint Application Setting provides the Node.js Function with the URL it reaches out to when communicating with the Graph API.
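One related detail from the client-creation code above: the "user" the Gremlin client authenticates as is actually the resource path of the database and collection, not a user name in the usual sense. A tiny sketch of that path construction (buildUserPath is an illustrative helper; the handler inlines the concatenation):

```javascript
// The Gremlin "user" is the Cosmos DB resource path identifying
// the database and collection being queried.
function buildUserPath(database, collection) {
    return "/dbs/" + database + "/colls/" + collection;
}

console.log(buildUserPath("graphdb", "graphcollz"));
// /dbs/graphdb/colls/graphcollz
```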

Configuring environment variables in the portal

Installing Dependencies via NPM

Since the Node.js code for the Azure Function will make use of the gremlin-secure NPM package to issue queries over to the Cosmos DB Graph, I'll need to install that and any other dependent NPM packages into my Function before it'll run properly.

If I were using a Git deploy model here, the Kudu deployment engine would be smart enough to install NPM packages for me. But I'm not using Kudu explicitly for this particular App Service; I'm building it using the web-based Azure portal code editor and testing tools.

Luckily, Kudu and App Service make it just as easy to install NPM packages from the portal as it is with Git deploy. As with any proper Node.js app, I'll need to add a package.json file so I can tell NPM which packages the code depends on. So I created a new file named package.json in my Function's file store.

Adding the package.json file

Next, add the following code to the package.json file you just added to the Function. The code is also in the repository containing this blog's sample code.

{
  "name": "node-gremlin",
  "version": "0.0.0",
  "description": "",
  "homepage": "",
  "author": {
    "name": "username",
    "email": "email@address.com",
    "url": "yoursite.com"
  },
  "main": "lib/index.js",
  "keywords": [
    "cosmos db",
    "gremlin",
    "node",
    "azure"
  ],
  "license": "MIT",
  "dependencies": {
    "gremlin-secure": "^1.0.2"
  }
}

The screen shot below shows what the package.json looks like once I added it to my own Function. Note that I've customized my author metadata and added keywords to the file's content; you may want to customize these to reflect your own... um... meta?

The customized package.json file

On the Platform features blade there's an icon in the Development Tools area called Console. Clicking this icon brings up an interactive console window right in the browser. By executing NPM commands in this console, you can manually install all the dependencies your app needs.

The Console icon in Development Tools

The screen shot below shows how I first need to cd into the correct folder to find the package.json I just created. Once I move into that folder, executing npm install brings down all the NPM dependencies needed by the Node.js code.

Running npm install in the console

When I flip back to the View files blade in the portal, I can see that the node_modules folder has been created. The dependencies have been installed, so now I can safely execute the Function and test it out using the portal's development tools.

The node_modules folder in View files

Now that all the dependencies have been installed I can test the function directly in the Azure portal.

Testing the Function

The Test area of the Azure portal is great for making sure Functions operate as expected, and it helps with debugging and validating that things are working properly.

Using the Request body area of the portal, I can customize the JSON content that will be POSTed over HTTP to the Function endpoint.

Testing the Function with a custom request body

Note the JSON in the bottom-right corner of the Azure portal. This JSON was returned from the Function based on the request body passed to the server-side function.
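The portal isn't the only way to exercise the handler. Because a JavaScript Function is just an exported function taking (context, req), you can drive its validation branch from plain Node with a mock context. The sketch below reproduces only the handler's name check, not the Gremlin call, so it runs without any connection to Cosmos DB:

```javascript
// A trimmed-down stand-in for the handler's validation branch,
// so we can check the 400 path without touching Cosmos DB.
function handler(context, req) {
    if (req.query.name || (req.body && req.body.name)) {
        context.res = { body: "would query the graph here" };
    } else {
        context.res = {
            status: 400,
            body: "Please pass a name on the query string or in the request body"
        };
    }
    context.done();
}

// Drive it with a mock context, the way the Functions host would.
const context = { log: console.log, done: () => {} };
handler(context, { query: {}, body: null });
console.log(context.res.status); // 400
```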

Summary

This is the second post in a series about Azure Functions and Cosmos DB, using the Graph API to access the Cosmos DB data. We've covered .NET and Node.js Functions that wrap around the Cosmos DB database, providing a variety of HTTP operations that clients can call to consume the data in the Cosmos DB Graph.

The next post will probably highlight how a single client app could be built to reach out to these back-end Functions, and in a later post we'll take a look at how these pieces could be automated. If you have any questions, requests, or ideas, feel free to use the comment tools below to throw something out, and I'll try to work it into additional posts in the series.

This is an exciting time for building serverless microservices that make use of the variety of back-end storage options in Azure, and I'm enjoying learning a few new tricks. Thanks for reading, and happy coding!


Querying Azure Cosmos DB's Graph API using an Azure Function

Azure Cosmos DB is an exciting new way to store data. It offers a few approaches to data storage, one of which had been mysterious to me - the Graph API. Cosmos DB supports dynamic graph structures and allows developers to use the rich Gremlin query syntax to access and update data. When I read the great walk-through article on docs.microsoft.com describing how to use Azure Cosmos DB's Graph API .NET Client, I got really excited about how dynamic my data could be in the Cosmos DB Graph API. I'd been looking for some demo subject matter to try out the new Azure Functions tools for Visual Studio, so I decided to put these three new ideas together and build an Azure Function that allows a serverless approach to searching a Cosmos DB database via the Graph API.

Update: I've made a few updates below after doing some due diligence on a dependency-related exception I observed during local debugging. As the SDKs evolve, I'll make additional updates in this area of the post.

Introducing the Serverless & Schemaless Series

Authoring this post and learning about Graph API was really exciting, and like all good technologists I found myself becoming a little obsessed with the Graph and the opportunities the world of Functions and Cosmos DB has to offer. Functions and Cosmos DB's nature together create a Serverless & Schemaless Scenario, and the opportunities this scenario provides for agile developers dealing with evolving structures and processes seem vast. This post is one in what I call the Serverless & Schemaless Series:

  1. Querying Azure Cosmos DB's Graph API using an Azure Function (this post)
  2. Querying Azure Cosmos DB using serverless Node.js - Walks through creating a Node.js Function that queries Cosmos DB's Graph API using an open-source Gremlin package
  3. TBD
  4. TBD

Building a Function using Visual Studio

The article I mentioned earlier, Azure Cosmos DB's Graph API .NET Client, has a companion GitHub repository containing some great getting-started code. The console application project basically sets up a Cosmos DB database with some sample person data. I forked the companion repository into my own fork here, which contains the basic Function code I'll describe below. So if you just want the code and aren't interested in my verbose discussion, you can find it there.

First step should be obvious - we need to add a Function to the solution.

Adding a Function to the solution

Once the Function project is created there are a few NuGet-related updates and installs we'll need to make. To make sure we're using the latest and greatest Functions SDK, it's a good idea to update the Microsoft.NET.Sdk.Functions package.

Updating the Microsoft.NET.Sdk.Functions package

The Cosmos DB Graph API team was nice enough to give us a .NET Client SDK, so we should use it. Use the handy NuGet tools to install the Microsoft.Azure.Graphs package.

Installing the Microsoft.Azure.Graphs package

During debugging, I noticed a few runtime errors indicating that the Mono.CSharp assemblies couldn't be found. I presume this has something to do with the emulated environment, but don't quote me on that. I followed up with Donna Malayeri, one of the awesome Program Managers on the Functions team, to get some details, thinking the reference might indicate a func.exe issue or potential emulator variance. She confirmed there's no dependency on Mono.CSharp in the Functions emulator.

So then I checked in with Andrew Liu, one of the awesome Program Managers on the Cosmos DB team. He confirmed that one of the Cosmos DB SDK's dependencies is Mono.CSharp. Come to think of it, the exception I saw while debugging did mention this dependency during a Graph API call.

I mention all these great folks not to name-drop, but so you know how to find them if you have questions, too. They're super receptive to feedback and love making their products better, so hit them up if you have ideas.

Either way - to debug this guy locally you'll need to install the Mono.CSharp package.

Installing the Mono.CSharp package

Once the dependencies are all in place (see the NuGet node in the Solution Explorer below for what-it-should-look-like), we'll need to write some code. To do this, add a new Function item to the project. I've named mine Search.cs, since the point of this Function will be to provide database searching.

Adding the Search Function to the project

The Function will respond to HTTP requests, so the Http Trigger template is appropriate here. We want this Function to be "wide open," too, so we'll set the Access rights menu to be Anonymous, which lets everyone through.

Choosing the Http Trigger template with Anonymous access

Once Search.cs is added to the Function project, add these using statements to the top of the file.

using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using Microsoft.Azure.Graphs;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

Once those have been added, replace the Function's class code with the code below. The code will simply search the Cosmos DB database using the Graph API for either all the people, or for the specific person identified via the name querystring parameter.

public static class Search
{
    static string endpoint = ConfigurationManager.AppSettings["Endpoint"];
    static string authKey = ConfigurationManager.AppSettings["AuthKey"];

    [FunctionName("Search")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
        TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        // the person objects will be free-form in structure
        List<dynamic> results = new List<dynamic>();

        // open the client's connection
        using (DocumentClient client = new DocumentClient(
            new Uri(endpoint),
            authKey,
            new ConnectionPolicy
            {
                ConnectionMode = ConnectionMode.Direct,
                ConnectionProtocol = Protocol.Tcp
            }))
        {
            // get a reference to the database the console app created
            Database database = await client.CreateDatabaseIfNotExistsAsync(
                new Database
                {
                    Id = "graphdb"
                });

            // get an instance of the database's graph
            DocumentCollection graph = await client.CreateDocumentCollectionIfNotExistsAsync(
                UriFactory.CreateDatabaseUri("graphdb"),
                new DocumentCollection { Id = "graphcollz" },
                new RequestOptions { OfferThroughput = 1000 });

            // build a gremlin query based on the existence of a name parameter
            string name = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
                .Value;

            IDocumentQuery<dynamic> query = (!String.IsNullOrEmpty(name))
                ? client.CreateGremlinQuery<dynamic>(graph, string.Format("g.V('{0}')", name))
                : client.CreateGremlinQuery<dynamic>(graph, "g.V()");

            // iterate over all the results and add them to the list
            while (query.HasMoreResults)
                foreach (dynamic result in await query.ExecuteNextAsync())
                    results.Add(result);
        }

        // return the list with an OK response
        return req.CreateResponse<List<dynamic>>(HttpStatusCode.OK, results);
    }
}

The code is basically the same connection logic as in the original console application which seeded the database, with a simple query to retrieve the matching records.

Debugging the Function Locally

Now that the code is complete, the Functions local debugging tools and emulator can be used to run the code locally so we can test it out.

Before the code will run properly, it must be configured for local execution with the Cosmos DB connection information. The local.settings.json file can be used to configure the Function for local execution much in the same way the App.config file is used to configure the original console application for execution.

Configuring local.settings.json
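For reference, a minimal local.settings.json for this Function might look like the following sketch. IsEncrypted and Values are the file's standard top-level keys; the Endpoint and AuthKey values are placeholders you'd swap for your own Cosmos DB account's values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "Endpoint": "https://your-cosmos-account.documents.azure.com:443/",
    "AuthKey": "<your-cosmos-db-auth-key>"
  }
}
```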

Once the Function app has been configured locally so it knows how to find the Cosmos DB database, hitting F5 will launch the local debugging tool (which probably has the funkiest name of all time), func.exe, with the Function code hosted and ready for use.

At the end of the initial output from func.exe, I can see that my Function is being hosted at localhost:7071. That's the address I'll use to test it from a client.

func.exe hosting the Function locally

To test my Function, I'll use Visual Studio Code with Huachao Mao's excellent extension, REST Client. REST Client offers local or remote HTTP request capability in a single right-click. I'll add the URL of my person search function and execute the HTTP request.

Executing the request with REST Client

I'm immediately presented with the raw HTTP response from the locally-running Function! Headers, JSON body content, everything.

The raw HTTP response from the Function

By adding the name querystring parameter with a value I know to be in the database, I can filter the results the Function returns.

Filtering the results with the name parameter

Once the Function is validated and seems to be working properly, the last step is publishing it to Azure App Service and configuring it to run in the cloud.

Publishing the Function to Azure

Luckily, publishing is nearly a single mouse-click. By right-clicking the project I can see the familiar Publish context menu item.

The Publish context menu

I'm ready to publish this to the cloud so it can be tested in a publicly available scenario, so I'll select the first option, Azure Function App, and choose the Create New radio button to create a brand-new Function App in my Azure subscription.

Creating a new Azure Function App

The publish panel opens next, allowing me to name my Function App. I'll opt for creating a new Consumption-based App Service plan, since I intend to use the pay-per-use billing method for my serverless Function. In addition, I'll create a new Storage Account to use with my Function in case I ever need support for Blobs, Tables, or Queues to trigger execution of other functionality I've not yet dreamed up.

Naming the Function App and creating its resources

Clicking the Create button in the dialog results in all of these resources being created in my Azure subscription. Visual Studio then downloads a publish profile (a simple XML file) that it'll use to publish the Function code the next time I want to do so.

Once the Function is published, I can flip to the Azure Portal blade for my Function. There, I'll see a link to the Function's Application settings. I'll need to go here, as this is where I'll configure the live Function for connectivity to the Cosmos DB database with my Person data.

The link to Application settings

Just as I did earlier in the console application's App.config file and in the Function app's local.settings.json file, I'll need to configure the published Function with the Endpoint and AuthKey values appropriate for my Cosmos DB database. This way, I never have to check in configuration code that contains my keys - I can configure them in the portal and be sure they're not stored in source control.

Configuring the Endpoint and AuthKey settings

Once the Function is configured properly in my Azure subscription, I can again use the Visual Studio Code REST Client extension to query my publicly-available Function URL.

Querying the published Function with REST Client

Summary

This post summarized how to write a basic Azure Function to search a super-small Cosmos DB database using the Graph API, but there's so much more opportunity here. I enjoyed this quick-and-dirty introduction to using Cosmos DB and Functions together - it really drives home the flexibility of pairing a serverless back end with a schemaless data store. The two are powerful together, enabling really rapid, fluid iteration on an API as the underlying data structure evolves.

I'll definitely be investigating Cosmos DB and Azure Functions together for some upcoming side-project ideas on my backlog, and I encourage you to take a look at them. The sample code in my fork of the repository - though still kind of brute-force - demonstrates how easy it is to get up and running with a serverless front end atop a Graph database with worldwide distribution capability.