A tutorial showing how to deploy SpaCy on a large scale

Author : mineshparikh21
Publish Date : 2021-06-05 17:24:49



Background:

spaCy is one of my favourite NLP libraries, and I have been using it to perform a lot of Named Entity Recognition (NER) tasks. Generally, we first need to load a spaCy pre-trained model for a specific language and fine-tune it with our training dataset. The training process can be done offline on a local computer, and we can even test the fine-tuned model's performance by hosting it locally through Flask / Streamlit.
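As a quick refresher, a minimal example of loading a pretrained English pipeline and running NER with it looks like this (the model name en_core_web_sm is just one common choice):

    import spacy

    # Load a pretrained English pipeline (assumes en_core_web_sm has been
    # downloaded, e.g. via `python -m spacy download en_core_web_sm`).
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")
    print([(ent.text, ent.label_) for ent in doc.ents])
    # e.g. [('Apple', 'ORG'), ('U.K.', 'GPE'), ('$1 billion', 'MONEY')]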

Although I have found many great tutorials on deploying a spaCy model locally with Flask / Streamlit, there are not many on how to deploy it on a larger scale, for example with AWS.

It’s a very interesting topic, and after a great deal of work I have summarized my solution in this article; hopefully it will be useful for people facing the same question.

Introduction:

In this article, I will explain my solution on how to deploy a custom spaCy model with AWS services including:

  • AWS ECR (Elastic Container Registry)
  • AWS SageMaker
  • AWS Lambda
  • AWS S3 Bucket (Optional)

Here is my plan 🧗🏻:

  • First, data input can be sent as an event into the AWS Lambda.
  • Then, within the Lambda, we invoke a SageMaker endpoint. The endpoint takes the data input coming from the Lambda event and returns a response containing the results from the spaCy model.
  • Finally, this endpoint response is displayed as the execution result, so we can check the response on the AWS Lambda results tab or debug it in AWS CloudWatch.

Through this process, the spaCy model is hosted on AWS SageMaker and can be invoked as an endpoint at any time. To create the SageMaker endpoint, we first need to create our custom spaCy model container on AWS ECR and prepare our model artifact in tar.gz format.
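To make the flow more concrete, here is a minimal sketch of what the Lambda side could look like. The endpoint name, the ENDPOINT_NAME environment variable, and the event layout are my own assumptions for illustration, not part of any official template:

    import os
    import boto3

    # Create the client once so it is reused across warm invocations.
    runtime = boto3.client("sagemaker-runtime")

    def lambda_handler(event, context):
        # Hypothetical setup: the endpoint name comes from an environment
        # variable and the event carries the raw CSV payload under "body".
        endpoint_name = os.environ["ENDPOINT_NAME"]
        response = runtime.invoke_endpoint(
            EndpointName=endpoint_name,
            ContentType="text/csv",
            Body=event["body"],
        )
        # The response body holds the spaCy model's predictions.
        return {
            "statusCode": 200,
            "body": response["Body"].read().decode("utf-8"),
        }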

We can create the model artifact either offline or online. Every time we finish training a spaCy model, the trained model is saved into a folder, which usually contains the following files:

*file names might be different

  • ner (folder)
  • tokenizer
  • vocab (folder)

In terms of creating the model artifact, we could simply compress this folder into tar.gz format and upload it to an S3 bucket. Alternatively, we could train the model online with AWS SageMaker by creating a training job under the SageMaker Training section; in that case, we need to provide our custom training image from AWS ECR and our training data from an AWS S3 bucket.
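For the offline route, a rough sketch of packaging and uploading the artifact could look like this (the folder, bucket and key names are placeholders):

    import tarfile
    import boto3

    # Compress the trained model folder (containing ner/, tokenizer, vocab/, ...)
    # into the tar.gz artifact format SageMaker expects.
    with tarfile.open("model.tar.gz", "w:gz") as tar:
        tar.add("my_spacy_model", arcname=".")

    # Upload the artifact to an S3 bucket (placeholder names).
    s3 = boto3.client("s3")
    s3.upload_file("model.tar.gz", "my-model-bucket", "spacy/model.tar.gz")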

Hopefully, you now have a clearer picture of the whole workflow. Now, let’s begin by creating our custom spaCy model container and uploading it to AWS ECR as our training image.

Creating A Custom Container on the AWS ECR

Because spaCy is not one of the SageMaker built-in algorithms, we have to create a spaCy model container on AWS ECR first, and specify that container as the training image when creating a model endpoint on SageMaker.

SageMaker provides two options: the first is to use the built-in algorithms that SageMaker offers, including KNN, XGBoost, Linear Learner, etc., while the other is to use your own custom Docker container from ECR.

The above article is an amazing reference that inspired me on how to deploy a custom spaCy container. I strongly encourage you to read through it to get a deeper understanding of how this Docker container works internally. It was quite complicated for me to understand at first. 🧠

In brief, it will include the following steps:

  • Edit the files under the GitHub repo (mainly the Dockerfile, train.py and predictor.py) in order to train a custom spaCy model and return predictions based on the data input.
  • Check the results by performing a local test for the spaCy docker container and debug the code if any error happens.
  • Finally, upload the spaCy container to the AWS ECR.

A bit more explanation about train.py and predictor.py, because these are the two files that determine how we want to train the model and what we expect the input and output of the model to look like.

In the train file, the objective is to modify the code to train a custom spaCy model. After reading the data from CSV files into a dataframe called train_data, we need to convert the dataframe into a format that the spaCy model can take as input, as sketched below. I would recommend reading a previous article I wrote about how to train a spaCy model for an NER task.
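As a rough sketch, assuming one entity annotation per row with character offsets (the column names here are hypothetical), the conversion could look like this:

    import pandas as pd

    # Hypothetical CSV layout: one row per entity annotation.
    train_df = pd.DataFrame({
        "text":  ["Apple is buying a U.K. startup", "Apple is buying a U.K. startup"],
        "start": [0, 18],
        "end":   [5, 22],
        "label": ["ORG", "GPE"],
    })

    # Convert to the (text, {"entities": [(start, end, label)]}) tuples
    # consumed by spaCy's classic NER training loop.
    TRAIN_DATA = []
    for text, group in train_df.groupby("text", sort=False):
        entities = [(int(r.start), int(r.end), r.label) for r in group.itertuples()]
        TRAIN_DATA.append((text, {"entities": entities}))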

In the predictor file, we need to edit the code under the predict function. The predict function only takes test_data as input. Since test_data is already a dataframe, we can apply the trained spaCy model to the column containing the source text and then create a new column for the model’s predictions. If it’s an NER task, the new column would contain the entities extracted from each input. We can also adjust whether the predict function returns just a list of predictions or a whole dataframe that contains both the source and the prediction columns.
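Here is a minimal sketch of such a predict function, assuming the model artifact is extracted to SageMaker’s conventional /opt/ml/model path and the source text lives in a column named text (both assumptions on my part):

    import pandas as pd
    import spacy

    # Assumptions: SageMaker extracts model.tar.gz to /opt/ml/model, and the
    # source text sits in a column named "text"; adjust both to your setup.
    nlp = spacy.load("/opt/ml/model")

    def predict(test_data: pd.DataFrame) -> pd.DataFrame:
        """Run the trained NER model on each row and attach the entities."""
        test_data = test_data.copy()
        test_data["entities"] = test_data["text"].apply(
            lambda t: [(ent.text, ent.label_) for ent in nlp(str(t)).ents]
        )
        return test_data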

The end of the predictor file requires a bit of Flask knowledge to understand. Although it’s recommended to know how it works, in practice we don’t need to modify many parts of the original Flask app. Just make sure that the output format of the predict function is consistent with the format used in the Flask app. The original function returns predictions as a list; I changed it to return a dataframe, so I modified the code in the Flask app to keep them consistent.
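For reference, a stripped-down sketch of that Flask piece, modified to return the whole dataframe as CSV, might look like this:

    import io
    import flask
    import pandas as pd

    def predict(test_data):
        # Placeholder: the real predict() is the one sketched above.
        return test_data

    app = flask.Flask(__name__)

    @app.route("/ping", methods=["GET"])
    def ping():
        # SageMaker pings this health check before routing any traffic.
        return flask.Response(response="\n", status=200, mimetype="application/json")

    @app.route("/invocations", methods=["POST"])
    def invocations():
        # Parse the incoming CSV payload into the test_data dataframe.
        test_data = pd.read_csv(io.StringIO(flask.request.data.decode("utf-8")))
        result = predict(test_data)
        # Return the source and prediction columns together as CSV.
        out = io.StringIO()
        result.to_csv(out, index=False)
        return flask.Response(response=out.getvalue(), status=200, mimetype="text/csv")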

Finally, in the Dockerfile, simply edit the code according to the libraries needed. In my case, I added RUN pip3 install -U spacy== to install the specific version of spaCy I used.

Congratulations if you got here! 🥳

To push our container image to Amazon ECR, we can follow the code from the above article’s GitHub repo.

These files are well explained in that article; we can simply run through the code to push our own custom spaCy container to AWS ECR.

At this point, we have finally uploaded our custom spaCy container to AWS ECR. This is a critical step for the model deployment. Next, we will switch to AWS SageMaker and create a model endpoint.


