In this article, we will see how it can be done using NodeJS with Express and MongoDB.
Other articles in the series are:
Before we dive into the code, it is important to understand the different components that will be used for the backend server. In this case, we will use NodeJS as the web server to host our backend application. To ease the development of the application, we will add a component called Express.
Besides the web server which will serve the data from our API, we need to store data. When it comes to data storage, we have, of course, the file system and two worlds exist with their benefits and disadvantages: SQL databases and NoSQL databases. You can find a lot of articles on the web about the benefits of each kind of data storage. I decided to use a NoSQL option and more specifically MongoDB.
To build the application, I decided to use a containerized version of MongoDB. This means that I installed Docker Desktop on my machine and started a MongoDB container using Docker Desktop. To get more information about the MongoDB image and how you can use it, please refer to this documentation. If you prefer, you can also install MongoDB manually in your environment, or you can even use MongoDB Atlas, which offers a free subscription.
To start the backend project, you need to have NodeJS installed (which should already be the case if you followed along with the other articles in the series).
Here are the steps to create the project:
The packages that you install have different purposes:
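Put together, the project setup might look like the following commands (the project name is an assumption; the packages match the components discussed in this article: Express, mongoose for MongoDB, the cors middleware, and nodemon as a development helper):

```shell
# create the project folder and initialize a NodeJS project
mkdir sasportal-api && cd sasportal-api
npm init -y

# web framework, MongoDB object modeling, and CORS middleware
npm install express mongoose cors

# development helper that restarts the server on file changes
npm install --save-dev nodemon
```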
After execution, open the package.json located in the root folder and add the following parameter:
Adding the type parameter and setting it to module tells NodeJS that imports should be done using ES6 modules instead of the default CommonJS packages. NodeJS supports both approaches, but as React uses ES6 imports, I decided to use the same technique for the backend to be consistent. If you prefer CommonJS, that is also possible, but you will need to adapt the code used in this article.
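In package.json, that addition looks like this (name and version are placeholders; only the type field matters here):

```json
{
  "name": "sasportal-api",
  "version": "1.0.0",
  "type": "module"
}
```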
Now that you have installed all the components, it is time to create the folder structure of the application. You should create the following structure:
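Based on the files discussed in the rest of this article, the structure looks roughly like this (the exact folder names are assumptions, but the file names match the ones referenced below):

```
sasportal-api/
├── app/
│   ├── config/
│   │   └── db.config.js
│   ├── controllers/
│   │   ├── preferences.controller.js
│   │   └── shortcuts.controller.js
│   ├── models/
│   │   ├── index.js
│   │   ├── preferences.model.js
│   │   └── shortcuts.model.js
│   └── routes/
│       ├── preferences.routes.js
│       └── shortcuts.routes.js
├── package.json
└── server.js
```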
Let me explain the structure of the application.
We will start with the db.config.js file. The file contains information about the MongoDB database.
By default, when you start the MongoDB container in Docker Desktop, the database can be reached on localhost:27017. This can be changed but it is good enough for our application. MongoDB and mongoose are flexible enough to create the database for you when needed.
This means that making sure that the Docker image is running is the only requirement. There is no other configuration needed.
You need to export the dbConfig object to be able to import it in other files.
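A minimal db.config.js could look like the following sketch (the database name sasportal is an assumption; the host and port match the Docker Desktop defaults mentioned above):

```javascript
// app/config/db.config.js
// Connection string for the MongoDB container running on Docker Desktop.
// mongoose will create the "sasportal" database on first use if it
// does not exist yet.
export const dbConfig = {
  url: "mongodb://localhost:27017/sasportal",
};
```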
Now that we have defined the connection to the database, we should define the data structures for the different data collections: preferences and shortcuts.
Here is the code for the models:
The first step is to import mongoose. When done, the code creates an element schema which contains the label, the href and the position like we had in the SHORTCUTS.json file in the React application. The switch between static files and the data API should be as transparent as possible.
The schema variable contains information about the user and an array of elements which are in fact using the element schema as data structure.
The next piece of code is a little trick that you can use to display the id and not _id when exporting the data in JSON format.
The final steps are to create the model using the schema and export it.
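Putting those steps together, a sketch of the shortcuts model could look like this (field names follow the description above; the file name is an assumption):

```javascript
// app/models/shortcuts.model.js
import mongoose from "mongoose";

// Each element mirrors an entry of the SHORTCUTS.json file
// used by the React application.
const elementSchema = new mongoose.Schema({
  label: String,
  href: String,
  position: Number,
});

// A document stores the shortcut elements for one user.
const schema = new mongoose.Schema({
  user: String,
  elements: [elementSchema],
});

// The trick: override toJSON so the exported JSON exposes "id"
// instead of MongoDB's internal "_id" (and drops the version key).
schema.method("toJSON", function () {
  const { __v, _id, ...object } = this.toObject();
  object.id = _id;
  return object;
});

// Create the model from the schema and export it.
const Shortcuts = mongoose.model("shortcuts", schema);
export default Shortcuts;
```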
The same logic is applied for the preferences model. The data structure has just an extra layer with the info schema.
With the models in place, we need to make them available to other components. Therefore, the index.js will act as a wrapper for the database information. It imports the different modules and then creates a db object which can be used by other components.
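A sketch of that wrapper, assuming the model files shown earlier:

```javascript
// app/models/index.js
import mongoose from "mongoose";
import { dbConfig } from "../config/db.config.js";
import Preferences from "./preferences.model.js";
import Shortcuts from "./shortcuts.model.js";

// Group everything database related into a single object
// that other components can import.
const db = {
  mongoose,
  url: dbConfig.url,
  preferences: Preferences,
  shortcuts: Shortcuts,
};

export default db;
```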
Defining the database structure is the first step but the application needs to perform CRUD operations on the data. This is the role of the controllers. The two controllers will implement the following methods:
The client application will not implement user interfaces for all these functionalities but at least they are available, and they can be used to populate data.
As you can see in the code, the db object containing the models is imported and then used in the different methods. The fact that we are using mongoose eases the development as mongoose provides functions to perform the basic CRUD operations. The code that you must write is to check that the input has the needed information to execute the operation and to write the response that is sent back to the client application.
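As an illustration, two of the preferences controller methods could be sketched like this (the input check and the error responses follow the pattern described above; exact messages and method names are assumptions):

```javascript
// app/controllers/preferences.controller.js (abbreviated sketch)
import db from "../models/index.js";

const Preferences = db.preferences;

// Create a new preferences document.
export const create = (req, res) => {
  // Validate that the input contains the required information.
  if (!req.body.user) {
    res.status(400).send({ message: "User cannot be empty!" });
    return;
  }
  const preferences = new Preferences(req.body);
  preferences
    .save()
    .then((data) => res.send(data))
    .catch((err) => res.status(500).send({ message: err.message }));
};

// Retrieve the preferences for a given user.
export const findOne = (req, res) => {
  Preferences.findOne({ user: req.params.user })
    .then((data) => res.send(data))
    .catch((err) => res.status(500).send({ message: err.message }));
};
```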
The same logic is applied to the shortcuts controller.
With the models, we have the data structure and with the controllers we have the actions which manipulate the data. The missing piece is the routing. Without the routes, the web server will not know which function should be executed in specific situations.
The routes are easy to build. You are mapping an HTTP verb (POST, GET, PUT, DELETE) to an action in the controller.
Here are the routes:
As you can see in the route's definition, we can pass URL parameters. These parameters are used in the controller to filter the data.
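Based on that description, a sketch of the preferences routes (the :user URL parameter is the one the controller reads to filter the data):

```javascript
// app/routes/preferences.routes.js (sketch)
import express from "express";
import * as preferences from "../controllers/preferences.controller.js";

const router = express.Router();

// Map HTTP verbs to controller actions.
router.post("/", preferences.create);
// The :user URL parameter is available as req.params.user
// in the controller and is used to filter the data.
router.get("/:user", preferences.findOne);
// put, delete, ... follow the same pattern.

export default router;
```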
We are now close to a working backend server. We need to adapt the server.js file to use the routes and add some extra configuration, for example for CORS.
Here is the content of the server.js file.
As you can see, there is no need to import the controllers or the models. They are hidden in the routes.
On line 9, we are setting the CORS options. Doing so allows connections to the server only from https://localhost:3000. This way, the sasportal client application can access it.
On line 17, we initiate the connection with the MongoDB database.
On lines 27 and 28, we tell the application that when a request is sent to "/api/preferences" or "/api/shortcuts", it should use the respective routes to handle the request.
The last piece of code defines the port on which the web server will be listening for requests.
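The whole file could be sketched as follows. Note that the line numbers mentioned above refer to the original listing; in this abbreviated sketch the positions differ, and the port 8080 is taken from the client URL used later in the article:

```javascript
// server.js (sketch)
import express from "express";
import cors from "cors";
import db from "./app/models/index.js";
import preferencesRoutes from "./app/routes/preferences.routes.js";
import shortcutsRoutes from "./app/routes/shortcuts.routes.js";

const app = express();

// Only allow the React client application to call the API.
const corsOptions = { origin: "https://localhost:3000" };
app.use(cors(corsOptions));

// Parse JSON request bodies.
app.use(express.json());

// Initiate the connection with the MongoDB database.
db.mongoose
  .connect(db.url)
  .then(() => console.log("Connected to the database!"))
  .catch((err) => {
    console.log("Cannot connect to the database!", err);
    process.exit(1);
  });

// Hand requests over to the respective routes.
app.use("/api/preferences", preferencesRoutes);
app.use("/api/shortcuts", shortcutsRoutes);

// Define the port on which the web server listens for requests.
const PORT = 8080;
app.listen(PORT, () => console.log(`Server is running on port ${PORT}.`));
```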
After saving the files, you can start the web server using the following command from a terminal: nodemon start
The server will start, and the following messages will be seen in the terminal:
You can now test the application using curl commands or using Postman. If you are using Postman, you can find in the Git repository a file named sasportal.postman_collection.json under the test folder. You can import that file into your Postman installation. After importing the file, the collection will appear in Postman and the queries can be executed.
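If you prefer curl over Postman, a quick smoke test could look like this (the JSON payload is only an example; the fields match the element schema described earlier):

```shell
# create shortcuts for the default user
curl -X POST http://localhost:8080/api/shortcuts \
  -H "Content-Type: application/json" \
  -d '{"user": "_default_", "elements": [{"label": "SAS", "href": "https://www.sas.com", "position": 1}]}'

# read them back
curl http://localhost:8080/api/shortcuts/_default_
```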
While testing the API, you will also load data into MongoDB. The data will be used when integrating with the portal application. If you followed along with the articles in the series, you may have used other report objects or shortcuts. You may need to adapt the JSON data in Postman to match your specific configuration.
As we have seen earlier, the data that we have loaded into MongoDB is equivalent to the data which was used in the portal application and stored under the /public/data/ folder. This means that if we now want to use the data stored in MongoDB, we should adapt two components: LandingPage.js and Shortcuts.js. You don't need to rewrite the complete components to use the API; you will just adapt the way the data is retrieved.
The original function in the LandingPage.js is:
The adapted function is:
As you see, we are accessing the data from http://localhost:8080/api/preferences/_default_ where the API is running. We are retrieving data for the _default_ user. Another difference with the original code is that we need to retrieve the elements property.
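Based on that description, the adapted retrieval in LandingPage.js looks roughly like this (the surrounding component code is omitted and the state setter name is an assumption):

```javascript
// LandingPage.js (sketch of the adapted data retrieval)
useEffect(() => {
  // Retrieve the preferences for the _default_ user from the API.
  fetch("http://localhost:8080/api/preferences/_default_")
    .then((response) => response.json())
    // The API wraps the data in an "elements" property.
    .then((data) => setPreferences(data.elements))
    .catch((error) => console.error(error));
}, []);
```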
The same kind of modifications should be made in Shortcuts.js.
The adapted function is:
The same changes were made to the URL but also to retrieve the elements property.
After these changes, the application works the same way as before, but the data is now retrieved from an API which dynamically collects data from MongoDB instead of from static files stored in the client application.
As we have seen in this article, you can easily create a backend application which uses MongoDB to store data. Using this basic application, you can add more models, more controllers, and more routes to the application to suit your needs. This change is transparent for the end-users, but it provides more flexibility when developing your application.
The backend and frontend are now completely decoupled. It means also that you can switch to another backend technology without impact on the frontend. The application that we have created uses a simplistic configuration. You can of course change the user and password that are used in MongoDB, you can configure an authentication mechanism for the backend server or define specific authorizations. All these considerations are left aside from this application for one good reason: when the complete application will be deployed in Kubernetes, MongoDB and the backend application will only be accessible from the cluster and only the frontend will be accessible from the external world. It means that unless an application is deployed in the same cluster, there will be no way to access the API nor the database. But we will see that in the next article in the series.
Find more articles from SAS Global Enablement and Learning here.