Commits
46 commits
4eab42a
feat(lidar-service): add containerized sensor microservice with MQTT/…
naman-ranka Jan 28, 2025
b14d53a
feat(weight-service): add Dockerized weight sensor microservice
naman-ranka Jan 28, 2025
43881b1
feat(analytics-service): add peripheral analytics microservice for Li…
naman-ranka Jan 28, 2025
885ee58
feat: add Grafana container for data visualization
naman-ranka Jan 28, 2025
80d6aff
fix(dependencies): update requests to 2.32.0 to resolve CVE-2024-35195
naman-ranka Jan 30, 2025
82ca6c3
chore(license): add Intel copyright notice to new Python files
naman-ranka Jan 30, 2025
69c9b08
refactor(common-sources): consolidate sensor directories and unify pu…
naman-ranka Feb 1, 2025
28bd544
feat(grafana): add dashboard for LiDAR and weight sensor data analysis
naman-ranka Feb 1, 2025
775d015
Merge branch 'main' into I-537
naman-ranka Feb 4, 2025
713d736
feat(grafana): Add datasource provisioning and update dashboard config
naman-ranka Feb 6, 2025
db874cd
feat(common-service): Add HTTP publish test and update docker-compose
naman-ranka Feb 6, 2025
0d6d176
feat(kafka): Update Kafka setup and add unit test for message consump…
naman-ranka Feb 7, 2025
f6ed7e3
docs(common-service): Add README file with setup and usage instructions
naman-ranka Feb 7, 2025
cb33398
fix(flask): Remove debug mode for security
naman-ranka Feb 7, 2025
c7416d6
feat(grafana): Use volumes for provisioning dashboards and datasources
naman-ranka Feb 7, 2025
adb98be
feat(mqtt): Use volumes for Mosquitto configuration instead of custom…
naman-ranka Feb 10, 2025
e6d2a77
Update src/common-service/readme.md
naman-ranka Feb 12, 2025
5d8a003
Update src/common-service/readme.md
naman-ranka Feb 12, 2025
be9473e
update(README.md) change readme file name as per naming conventions
naman-ranka Feb 12, 2025
6b52676
update - fix small issues raised by reviewdog
naman-ranka Feb 12, 2025
c29bf3b
Update src/docker-compose.yml
naman-ranka Feb 13, 2025
ea6be69
update requirements.txt
naman-ranka Feb 14, 2025
b8ac9b0
feat(lidar-service): add containerized sensor microservice with MQTT/…
naman-ranka Jan 28, 2025
9569eb3
feat(weight-service): add Dockerized weight sensor microservice
naman-ranka Jan 28, 2025
66155e6
feat(analytics-service): add peripheral analytics microservice for Li…
naman-ranka Jan 28, 2025
2343059
feat: add Grafana container for data visualization
naman-ranka Jan 28, 2025
237eb9d
fix(dependencies): update requests to 2.32.0 to resolve CVE-2024-35195
naman-ranka Jan 30, 2025
b105b3a
chore(license): add Intel copyright notice to new Python files
naman-ranka Jan 30, 2025
c2765cb
refactor(common-sources): consolidate sensor directories and unify pu…
naman-ranka Feb 1, 2025
b62b600
feat(grafana): add dashboard for LiDAR and weight sensor data analysis
naman-ranka Feb 1, 2025
2ab2901
feat(grafana): Add datasource provisioning and update dashboard config
naman-ranka Feb 6, 2025
eaadb3c
feat(common-service): Add HTTP publish test and update docker-compose
naman-ranka Feb 6, 2025
78f7bca
feat(kafka): Update Kafka setup and add unit test for message consump…
naman-ranka Feb 7, 2025
2bc4a6e
docs(common-service): Add README file with setup and usage instructions
naman-ranka Feb 7, 2025
20089de
fix(flask): Remove debug mode for security
naman-ranka Feb 7, 2025
c222aeb
feat(grafana): Use volumes for provisioning dashboards and datasources
naman-ranka Feb 7, 2025
53eee2d
feat(mqtt): Use volumes for Mosquitto configuration instead of custom…
naman-ranka Feb 10, 2025
ca1eaa6
Update src/common-service/readme.md
naman-ranka Feb 12, 2025
c3d41ae
Update src/common-service/readme.md
naman-ranka Feb 12, 2025
6961fcf
update(README.md) change readme file name as per naming conventions
naman-ranka Feb 12, 2025
dadf060
update - fix small issues raised by reviewdog
naman-ranka Feb 12, 2025
83fe5a2
Update src/docker-compose.yml
naman-ranka Feb 13, 2025
95ed28e
update requirements.txt
naman-ranka Feb 14, 2025
7922533
Merge branch 'I-537' of github.com:naman-ranka/automated-self-checkou…
naman-ranka Feb 14, 2025
0a606ac
add copyright
naman-ranka Feb 14, 2025
a332b07
Docs: Add Sensor Analytics and Visualization Section to README
google-labs-jules[bot] Nov 25, 2025
30 changes: 30 additions & 0 deletions README.md
Original file line number Diff line number Diff line change
@@ -34,6 +34,36 @@ stop containers:
make down
```

## Sensor Analytics and Visualization

This project includes a `common-service` for managing LiDAR and Weight sensors, which can be configured to publish data to MQTT, Kafka, and HTTP. It also includes Grafana for real-time data visualization over MQTT.

### Overview

* **Common-Service**: A single container handles both LiDAR and Weight sensors. Each sensor has its own configuration for sensor ID, port, and mock mode.
* **Data Publishing**: The service can publish data to MQTT, Kafka, and HTTP, and is fully controlled via `docker-compose.yml`.
* **Grafana Integration**: A Grafana container is provided with a preloaded "Sensor-Analytics" dashboard and an MQTT data source.

### How to Use

1. **Start all services:**

```bash
make run-demo
```

2. **Access Grafana:**

* Go to [http://localhost:3000](http://localhost:3000)
* Default Credentials: `admin` / `admin`

3. **View the Dashboard:**

* Look for "Sensor-Analytics" in the Dashboards list.
* If you see no data, go to **Configuration > Data Sources** and confirm the MQTT data source URI is set to `tcp://mqtt-broker_1:1883` or `tcp://mqtt-broker:1883` (depending on your Docker network name).

For more advanced configuration and testing of Kafka and HTTP publishing, please refer to the `README.md` file in the `src/common-service` directory.

## [Advanced Documentation](https://intel-retail.github.io/documentation/use-cases/automated-self-checkout/automated-self-checkout.html)

## Join the community
30 changes: 30 additions & 0 deletions src/common-service/Dockerfile
@@ -0,0 +1,30 @@
#
# Copyright (C) 2025 Intel Corporation.
#
# SPDX-License-Identifier: Apache-2.0
#

# Use an official Python runtime as a parent image
FROM python:3.10-slim

# Set environment variables to prevent Python from writing pyc files and buffering stdout/stderr
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Install system dependencies (gcc is needed to build Python package extensions)
RUN apt-get update && apt-get install -y --no-install-recommends gcc \
    && rm -rf /var/lib/apt/lists/*

# Set the working directory in the container
WORKDIR /app

# Copy the requirements file into the container
COPY requirements.txt /app/

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code into the container
COPY . /app/

# run both apps in parallel
CMD ["/bin/bash", "-c", "python lidar_app.py & python weight_app.py & wait"]
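
The container is wired up through `docker-compose.yml`. A sketch of what the `asc_common_service` entry might look like, using only environment variables documented in the service README (the `build` path, volume layout, and broker service name are assumptions, not the repository's actual compose file):

```yaml
asc_common_service:
  build: ./common-service        # assumed build context
  environment:
    LIDAR_COUNT: "1"
    LIDAR_SENSOR_ID_1: "lidar-001"
    LIDAR_MOCK_1: "true"
    LIDAR_MQTT_ENABLE: "true"
    LIDAR_MQTT_BROKER_HOST: "mqtt-broker"
    LIDAR_MQTT_BROKER_PORT: "1883"
    WEIGHT_COUNT: "1"
    WEIGHT_MOCK_1: "true"
    WEIGHT_MQTT_ENABLE: "true"
    WEIGHT_MQTT_BROKER_HOST: "mqtt-broker"
  depends_on:
    - mqtt-broker
```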
128 changes: 128 additions & 0 deletions src/common-service/README.md
@@ -0,0 +1,128 @@
# Common-Service: LiDAR & Weight Sensor Microservice
This microservice manages **both LiDAR and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.
## 1. Overview

- **Sensors**
  - LiDAR & Weight support in the same codebase.
  - Per-sensor configuration (e.g., ID, port, mock mode, intervals).
- **Publishing**
  - `publisher.py` handles publishing to one or more protocols: **MQTT**, **Kafka**, and **HTTP**.
- **Apps**
  - Two main modules: `lidar_app.py` and `weight_app.py`.
  - Each uses shared methods from `publisher.py` & `config.py`.

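The shared structure of `lidar_app.py` and `weight_app.py` can be sketched as follows. This is a simplified illustration, not the repository's code: `make_reading`, the payload fields, and the `publish` callback standing in for `publisher.py`'s fan-out are all assumptions based on the description above.

```python
import random
import time

def make_reading(sensor_id: str) -> dict:
    # Mock-mode reading; a real hardware read would replace this.
    return {"sensor_id": sensor_id, "value": random.random(), "ts": time.time()}

def run_sensor_loop(sensor: dict, publish, iterations: int) -> None:
    # publish() stands in for publisher.py's fan-out to MQTT/Kafka/HTTP.
    for _ in range(iterations):
        publish(make_reading(sensor["id"]))
        time.sleep(sensor.get("publish_interval", 1.0))

# Collect three readings with a zero interval instead of publishing them.
messages = []
run_sensor_loop({"id": "lidar-001", "publish_interval": 0.0}, messages.append, iterations=3)
print(len(messages))  # 3
```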
## 2. Environment Variables
All settings are defined in `docker-compose.yml` under the `asc_common_service` section. Key variables include:
### LiDAR
| Variable | Description | Example |
| --- | --- | --- |
| LIDAR_COUNT | Number of LiDAR sensors | 2 |
| LIDAR_SENSOR_ID_1 | Unique ID for first LiDAR sensor | lidar-001 |
| LIDAR_SENSOR_ID_2 | Unique ID for second LiDAR sensor (if any) | lidar-002 |
| LIDAR_MOCK_1 | Enable mock data for first LiDAR sensor (true/false) | true |
| LIDAR_MQTT_ENABLE | Toggle MQTT publishing | true |
| LIDAR_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker or mqtt-broker_1 |
| LIDAR_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| LIDAR_KAFKA_ENABLE | Toggle Kafka publishing | true |
| KAFKA_BOOTSTRAP_SERVERS | Kafka bootstrap server addresses | kafka:9093 |
| LIDAR_KAFKA_TOPIC | Kafka topic name for LiDAR data | lidar-data |
| LIDAR_HTTP_ENABLE | Toggle HTTP publishing | true |
| LIDAR_HTTP_URL | HTTP endpoint URL for LiDAR data | http://localhost:5000/api/lidar_data |
| LIDAR_PUBLISH_INTERVAL | Interval (in seconds) for LiDAR data publishing | 1.0 |
| LIDAR_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Weight
| Variable | Description | Example |
| --- | --- | --- |
| WEIGHT_COUNT | Number of Weight sensors | 2 |
| WEIGHT_SENSOR_ID_1 | Unique ID for first Weight sensor | weight-001 |
| WEIGHT_SENSOR_ID_2 | Unique ID for second Weight sensor (if any) | weight-002 |
| WEIGHT_MOCK_1 | Enable mock data for first Weight sensor (true/false) | true |
| WEIGHT_MQTT_ENABLE | Toggle MQTT publishing | true |
| WEIGHT_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| WEIGHT_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| WEIGHT_KAFKA_ENABLE | Toggle Kafka publishing | false |
| WEIGHT_MQTT_TOPIC | MQTT topic name for Weight data | weight/data |
| WEIGHT_HTTP_ENABLE | Toggle HTTP publishing | false |
| WEIGHT_PUBLISH_INTERVAL | Interval (in seconds) for Weight data publishing | 1.0 |
| WEIGHT_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

> **Note:** Set each `*_ENABLE` variable to `"true"` or `"false"` to enable or disable that protocol. Adjust intervals, logging levels, or sensor counts as needed.
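
These toggles are plain strings; `config.py` parses them case-insensitively, so only the string `"true"` (any case) enables a publisher. The pattern boils down to:

```python
import os

def env_flag(name: str, default: str = "false") -> bool:
    # Mirrors the parsing in config.py: "true"/"True"/"TRUE" enable,
    # anything else (including unset) disables.
    return os.getenv(name, default).lower() == "true"

os.environ["LIDAR_MQTT_ENABLE"] = "True"
os.environ["LIDAR_KAFKA_ENABLE"] = "no"
print(env_flag("LIDAR_MQTT_ENABLE"))   # True
print(env_flag("LIDAR_KAFKA_ENABLE"))  # False
```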
## 3. Usage

1. **Build and Run**

```bash
make run-demo
```
This spins up the `asc_common_service` container (and related services like Mosquitto or Kafka, depending on your configuration).

2. **Data Flow**
- By default, LiDAR publishes to `lidar/data` (MQTT, if enabled) or `lidar-data` (Kafka), or an HTTP endpoint if configured.

- Weight sensor similarly publishes to `weight/data` or `weight-data`.

3. **Mock Mode**
- Setting `LIDAR_MOCK_1="true"` (or `WEIGHT_MOCK_1="true"`) forces the sensor to generate **random** data rather than reading from actual hardware.

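How mock mode might branch inside a sensor read (illustrative only; the actual field names, value ranges, and hardware-read path in the repository may differ):

```python
import random

def read_lidar(sensor: dict) -> float:
    if sensor["mock"]:
        # Mock mode: random distance in a plausible range (metres, assumed).
        return round(random.uniform(0.1, 10.0), 3)
    # Real mode would read from the serial port in sensor["port"].
    raise NotImplementedError("hardware read not shown in this sketch")

reading = read_lidar({"mock": True, "port": "/dev/ttyUSB0"})
print(0.1 <= reading <= 10.0)  # True
```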
## 4. Testing

### A. MQTT

- **Grafana** : A pre-loaded dashboard named *Sensor-Analytics* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).

- Check that the MQTT data source in Grafana points to `tcp://mqtt-broker_1:1883` (or `tcp://mqtt-broker:1883`, depending on the network).

### B. Kafka

- Enable Kafka for LiDAR/Weight by setting `LIDAR_KAFKA_ENABLE="true"` and/or `WEIGHT_KAFKA_ENABLE="true"`.

- Test from inside the container:

```bash
docker exec asc_common_service python kafka_publisher_test.py --topic lidar-data
```
You should see incoming messages in the console.

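A standalone consumer equivalent to `kafka_publisher_test.py` could be written with the `kafka-python` package (an assumption; the repository's script may use a different client). The JSON decoding step is separable and shown as a pure function:

```python
import json

def decode_message(raw: bytes) -> dict:
    # Kafka delivers opaque bytes; the service publishes JSON payloads.
    return json.loads(raw.decode("utf-8"))

def consume_forever(topic: str = "lidar-data", servers: str = "kafka:9093") -> None:
    # Requires a running broker; kafka-python is an assumed dependency.
    from kafka import KafkaConsumer
    for msg in KafkaConsumer(topic, bootstrap_servers=servers):
        print(decode_message(msg.value))

# Decoding works without a broker:
print(decode_message(b'{"sensor_id": "lidar-001", "distance": 2.5}'))
```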
### C. HTTP

1. **Local Test (Inside Docker)**

- Set `LIDAR_HTTP_URL="http://localhost:5000/api/lidar_data"` in the environment.
- Run `make run-demo` and wait for all containers to start.
- Once up, execute:

```bash
docker exec asc_common_service python http_publisher_test.py
```

- This will trigger the HTTP publisher and display the received data inside the container.

2. **Using an External Webhook Service**

- Visit [Webhook.site](https://webhook.site/) and get a unique URL.
- Set `LIDAR_HTTP_URL` to this URL.
- Run `make run-demo`, and you should see the HTTP requests arriving on the Webhook.site dashboard.

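As an alternative to an external webhook, a tiny local receiver can stand in for the `/api/lidar_data` endpoint. This stdlib-only sketch (payload fields are assumed) accepts a POST the way the HTTP publisher would send it:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.append(json.loads(body))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate one publish from the service.
url = f"http://127.0.0.1:{server.server_port}/api/lidar_data"
payload = json.dumps({"sensor_id": "lidar-001", "distance": 2.5}).encode()
req = urllib.request.Request(url, data=payload,
                             headers={"Content-Type": "application/json"})
urllib.request.urlopen(req).read()
server.shutdown()
print(received[0]["sensor_id"])  # lidar-001
```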


## 5. Contributing & Development

- **Code Structure**
- `publisher.py`: Core publishing logic (MQTT, Kafka, HTTP).

- `config.py`: Loads environment variables and configures each sensor.

- `lidar_app.py` and `weight_app.py`: Sensor-specific logic.
110 changes: 110 additions & 0 deletions src/common-service/config/config.py
@@ -0,0 +1,110 @@
# src/common-service/config/config.py
import os
import logging
from typing import Dict

def read_lidar_config() -> Dict:
"""
Read environment variables for LiDAR configuration
and return them as a dictionary.
"""
config = {
"lidar_count": int(os.getenv("LIDAR_COUNT", "1")),
"lidar_sensors": [],
"publishers": {
"mqtt": {
"enable": os.getenv("LIDAR_MQTT_ENABLE", "false").lower() == "true",
"host": os.getenv("LIDAR_MQTT_BROKER_HOST", "localhost"),
"port": int(os.getenv("LIDAR_MQTT_BROKER_PORT", "1883")),
"topic": os.getenv("LIDAR_MQTT_TOPIC", "lidar/data")
},
"http": {
"enable": os.getenv("LIDAR_HTTP_ENABLE", "false").lower() == "true",
"url": os.getenv("LIDAR_HTTP_URL", "")
},
"kafka": {
"enable": os.getenv("LIDAR_KAFKA_ENABLE", "false").lower() == "true",
"bootstrap_servers": os.getenv("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092"),
"topic": os.getenv("LIDAR_KAFKA_TOPIC", "lidar-data")
}
},
"global": {
"log_level": os.getenv("LIDAR_LOG_LEVEL", "INFO"),
"publish_interval": float(os.getenv("LIDAR_PUBLISH_INTERVAL", "1.0"))
}
}

# Load individual LiDAR sensor configurations
for i in range(1, config["lidar_count"] + 1):
sensor = {
"id": os.getenv(f"LIDAR_SENSOR_ID_{i}", f"lidar-{i:03}"),
"port": os.getenv(f"LIDAR_PORT_{i}", f"/dev/ttyUSB{i-1}"),
"mock": os.getenv(f"LIDAR_MOCK_{i}", "true").lower() == "true",
"publish_interval": float(
os.getenv(f"LIDAR_PUBLISH_INTERVAL_{i}", config["global"]["publish_interval"])
)
}
config["lidar_sensors"].append(sensor)

return config


def read_weight_config() -> Dict:
"""
Read environment variables for Weight Sensor configuration
and return them as a dictionary.
"""
config = {
"weight_count": int(os.getenv("WEIGHT_COUNT", "1")),
"weight_sensors": [],
"publishers": {
"mqtt": {
"enable": os.getenv("WEIGHT_MQTT_ENABLE", "false").lower() == "true",
"host": os.getenv("WEIGHT_MQTT_BROKER_HOST", "localhost"),
"port": int(os.getenv("WEIGHT_MQTT_BROKER_PORT", "1883")),
"topic": os.getenv("WEIGHT_MQTT_TOPIC", "weight/data")
},
"http": {
"enable": os.getenv("WEIGHT_HTTP_ENABLE", "false").lower() == "true",
"url": os.getenv("WEIGHT_HTTP_URL", "")
},
"kafka": {
"enable": os.getenv("WEIGHT_KAFKA_ENABLE", "false").lower() == "true",
"bootstrap_servers": os.getenv("WEIGHT_KAFKA_BOOTSTRAP_SERVERS", "localhost:9092"),
"topic": os.getenv("WEIGHT_KAFKA_TOPIC", "weight-data")
}
},
"global": {
"log_level": os.getenv("WEIGHT_LOG_LEVEL", "INFO"),
"publish_interval": float(os.getenv("WEIGHT_PUBLISH_INTERVAL", "1.0"))
}
}

# Load individual Weight Sensor configurations
for i in range(1, config["weight_count"] + 1):
sensor = {
"id": os.getenv(f"WEIGHT_SENSOR_ID_{i}", f"weight-{i:03}"),
"port": os.getenv(f"WEIGHT_PORT_{i}", f"/dev/ttyUSB{i-1}"),
"mock": os.getenv(f"WEIGHT_MOCK_{i}", "true").lower() == "true",
"publish_interval": float(
os.getenv(f"WEIGHT_PUBLISH_INTERVAL_{i}", config["global"]["publish_interval"])
)
}
config["weight_sensors"].append(sensor)

return config


def setup_logging(config: Dict):
"""
Configure logging from a given config dictionary.
Expects 'log_level' under config["global"].
"""
level_name = config["global"].get("log_level", "INFO")
    level = getattr(logging, level_name.upper(), logging.INFO)
logging.basicConfig(
level=level,
format="%(asctime)s [%(levelname)s] %(name)s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
logging.info(f"Logging configured at {level_name} level")
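
As a quick check of the per-sensor environment pattern above, here is a self-contained mirror of the `LIDAR_SENSOR_ID_{i}` loop (inlined rather than importing the module, so the sensor names below are illustrative):

```python
import os

os.environ.update({
    "LIDAR_COUNT": "2",
    "LIDAR_SENSOR_ID_1": "checkout-lane-1",
    # LIDAR_SENSOR_ID_2 left unset -> falls back to "lidar-002"
})

count = int(os.getenv("LIDAR_COUNT", "1"))
sensors = [
    {
        "id": os.getenv(f"LIDAR_SENSOR_ID_{i}", f"lidar-{i:03}"),
        "mock": os.getenv(f"LIDAR_MOCK_{i}", "true").lower() == "true",
    }
    for i in range(1, count + 1)
]
print([s["id"] for s in sensors])  # ['checkout-lane-1', 'lidar-002']
```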