user continuously enter text for classification until they decide to
exit.
11. Analyze the text.
```python
input_text_score = sia.polarity_scores(input_text)["compound"]
input_text_classification = 0 if input_text_score > threshold else 1
```
12. Print the VADER Classification Report and the sentiment analysis.
```python
print(f"Accuracy: {accuracy:.2f}")
print("\nVADER Classification Report:")
print(report_vader)
print(f"\nTest Text (Positive): '{input_text}'")
print(f"Predicted Sentiment: {'Positive' if input_text_classification == 0 else 'Negative'}")
```
13. Create `requirements.txt`. The sample application already contains the
`requirements.txt` file to specify the necessary packages that the
application imports. Open `requirements.txt` in a code or text editor to
explore its contents.
```text
# 01 sentiment_analysis
nltk==3.6.5
...
# 03 text_classification
scikit-learn==1.3.2
...
```
Both the `nltk` and `scikit-learn` modules are required for the text
classification application.
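To see the thresholding logic from step 11 in isolation, here is a minimal sketch that uses hypothetical compound scores instead of calling VADER. The `threshold` default and the 0/1 label convention mirror the snippets above:

```python
def classify(compound_score, threshold=0):
    # VADER compound scores range from -1 (most negative) to +1 (most positive).
    # The application maps scores above the threshold to label 0 (positive)
    # and everything else to label 1 (negative).
    return 0 if compound_score > threshold else 1

for score in (0.82, -0.55, 0.0):
    label = classify(score)
    print(f"score={score:+.2f} -> {'Positive' if label == 0 else 'Negative'}")
```

Note that a score exactly equal to the threshold falls on the negative side, because the comparison is strict (`>`).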
## Explore the application environment
You'll use Docker to run the application in a container. Docker lets you
containerize the application, providing a consistent and isolated environment
for running it. This means the application will operate as intended within its
Docker container, regardless of the underlying system differences.
To run the application in a container, a Dockerfile is required. A Dockerfile is
a text document that contains all the commands you would call on the command
line to assemble an image. An image is a read-only template with instructions
for creating a Docker container.
The sample application already contains a `Dockerfile`. Open the `Dockerfile` in a code or text editor to explore its contents.
The following steps explain each part of the `Dockerfile`. For more details, see the [Dockerfile reference](/reference/dockerfile/).
1. Specify the base image.
```dockerfile
FROM python:3.8-slim
```
This command sets the foundation for the build. `python:3.8-slim` is a
lightweight version of the Python 3.8 image, optimized for size and speed.
Using this slim image reduces the overall size of your Docker image, leading
to quicker downloads and less surface area for security vulnerabilities. This
is particularly useful for a Python-based application where you might not
need the full standard Python image.
2. Set the working directory.
```dockerfile
WORKDIR /app
```
`WORKDIR` sets the current working directory within the Docker image. By
setting it to `/app`, you ensure that all subsequent commands in the
Dockerfile (like `COPY` and `RUN`) are executed in this directory. This also
helps in organizing your Docker image, as all application-related files are
contained in a specific directory.
3. Copy the requirements file into the image.
```dockerfile
COPY requirements.txt /app
```
The `COPY` command transfers the `requirements.txt` file from
your local machine into the Docker image. This file lists all Python
dependencies required by the application. Copying it into the container
lets the next command (`RUN pip install`) install these dependencies
inside the image environment.
4. Install the Python dependencies in the image.
```dockerfile
RUN pip install --no-cache-dir -r requirements.txt
```
This line uses `pip`, Python's package installer, to install the packages
listed in `requirements.txt`. The `--no-cache-dir` option disables pip's
cache, which reduces the size of the Docker image by not storing
unnecessary cache data.
5. Run additional commands.
```dockerfile
RUN python -m spacy download en_core_web_sm
```
This step is specific to NLP applications that require the spaCy library. It downloads the `en_core_web_sm` model, which is a small English language model for spaCy. While not needed for this app, it's included for compatibility with other NLP applications that might use this Dockerfile.
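Models downloaded with `python -m spacy download` are installed as regular Python packages inside the image. As a quick sanity check (a sketch, not part of the sample application), you can verify that the model package is importable without loading spaCy itself:

```python
import importlib.util

def model_installed(name="en_core_web_sm"):
    # spaCy models are distributed as importable packages, so their
    # presence can be detected with the standard library alone.
    return importlib.util.find_spec(name) is not None

print(model_installed())  # True inside the built image, False otherwise
```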