AI-based Custom Vision workplace safety detection system
Protecting workers from casualties and accidents is critical in any workplace. This is also part of the AI for Good projects.
The project has been showcased on the following websites:
https://microsoft.github.io/ai-at-edge/docs/workplace_safety/
https://azure.github.io/Vision-AI-DevKit-Pages/docs/community_project02/
Featured project on hackster.io:
https://www.hackster.io/balabala76/ai-for-good-workplace-safety-ca0ea5
Purdue Paper:
https://github.com/balakreshnan/WorkplaceSafety/blob/master/PPE_ComplianceDetection_CLF2020_Revx.pdf
PPE paper published in the proceedings of the 10th Conference on Learning Factories:
https://www.sciencedirect.com/science/article/pii/S2351978920310556?via%3Dihub
Note: if this project is referenced or showcased on another website and you would like it listed above, please email me.
The system provides vision-based compliance detection. In environments where workforce safety is critical, compliance rules require workers to wear proper gear to safeguard them and keep them safe and secure. For example, in manufacturing, human casualties are something everyone wants to avoid, so workers are required to wear vests, hard hats, and safety glasses, or lab coats and goggles; in some cases, masks and fully covered suits are required, for example around chemical spills. The idea here is to detect humans and then check whether they are wearing a vest, hard hat, and safety glasses. Manufacturing plants usually also have marked lines where people are allowed to walk, and detecting those is planned as future work. The system detects humans and their compliance, and alerts management with reporting and real-time alerts. It can also detect forklifts and warn the forklift operator of humans in the way. The system does not just detect objects; it also stores the information for further reporting and analysis.
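The per-person compliance check described above can be sketched in Python. This is a hypothetical illustration, not the actual module code: the label names (`vest`, `hardhat`, `safety_glasses`) and the confidence threshold are assumptions, and in the real system the labels come from the Custom Vision model.

```python
# Hypothetical sketch of the compliance check: given the labels the
# vision model detected for one person, report which required gear is
# missing. Label names and the threshold are assumptions.
REQUIRED_GEAR = {"vest", "hardhat", "safety_glasses"}
CONFIDENCE_THRESHOLD = 0.5

def missing_gear(detections):
    """detections: list of (label, confidence) pairs for one detected person.
    Returns the sorted list of required gear not confidently detected."""
    worn = {label for label, conf in detections if conf >= CONFIDENCE_THRESHOLD}
    return sorted(REQUIRED_GEAR - worn)

# Example: vest and hard hat detected confidently, safety glasses only
# at low confidence, so they are treated as missing and would raise an alert.
alerts = missing_gear([("vest", 0.91), ("hardhat", 0.84), ("safety_glasses", 0.30)])
print(alerts)
```

In the deployed system an alert would be raised whenever this list is non-empty, and the detection record would be sent to IoT Hub as telemetry.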
For example, plants usually have to provide yearly or quarterly reports to OSHA auditors showing whether there were any human casualties and what actions were taken to prevent recurrence. Having pictures from when objects were detected, and when they were not, is very helpful for auditors analyzing the data and makes the audit simple and easy. The main benefit, though, is that the plant or factory can keep running: there is no downtime, and no workforce has to be pulled away to go through the auditing process. Audits usually cause downtime, which this approach reduces, increasing productivity and uptime.
It is necessary to detect violations and provide reports, and it is important to store the data historically to enable auditing and to learn from the data. The historical data can also be combined with other productivity data to find insights, so pushing the data into a data lake makes sense. It is also important to know how the system is performing, so we collect telemetry and store it in Azure SQL and Blob storage for further processing. From there we can generate weekly or monthly reports on how many compliance issues were raised, and analyze the data to find where the model performs well and where it does not.
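As a sketch of the weekly/monthly reporting described above, the query below aggregates detections per label per week. It uses a local SQLite stand-in with column names mirroring the `visionkitinputs` table defined later in this document; the sample labels and rows are invented for illustration, and against the real Azure SQL table the same shape of query would apply.

```python
import sqlite3

# Local SQLite stand-in for the Azure SQL table [dbo].[visionkitinputs];
# column names mirror the schema defined later in this document.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE visionkitinputs (
        id INTEGER PRIMARY KEY,
        confidence REAL,
        label TEXT,
        EventEnqueuedUtcTime TEXT,
        ConnectionDeviceId TEXT
    )
""")

# Sample telemetry rows as they might arrive from the device (invented data).
rows = [
    (0.92, "hardhat",    "2019-09-01T08:00:00", "visionkit01"),
    (0.45, "no_hardhat", "2019-09-01T08:05:00", "visionkit01"),
    (0.88, "vest",       "2019-09-08T09:00:00", "visionkit01"),
    (0.71, "no_vest",    "2019-09-08T09:10:00", "visionkit02"),
]
conn.executemany(
    "INSERT INTO visionkitinputs (confidence, label, EventEnqueuedUtcTime, ConnectionDeviceId) "
    "VALUES (?, ?, ?, ?)", rows)

# Weekly compliance report: detection count and average confidence per label per week.
report = conn.execute("""
    SELECT strftime('%Y-%W', EventEnqueuedUtcTime) AS week,
           label,
           COUNT(*)        AS detections,
           AVG(confidence) AS avg_confidence
    FROM visionkitinputs
    GROUP BY week, label
    ORDER BY week, label
""").fetchall()
for row in report:
    print(row)
```

The same `GROUP BY` shape, with `DATEPART` in place of `strftime`, would produce the monthly or weekly compliance report against the Azure SQL tables.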
The scenario can be adapted to other use cases such as hospitals, chemical plants, and various other heavy-machinery and mining industries.
To get started, add the camera tagging module to the vision kit by following the link below:
https://github.com/balakreshnan/WorkplaceSafety/blob/master/CameraTaggingModule/readme.md
For example, in a real factory, plant, hospital, or other scenario, it is hard to build a model unless we have pictures. With the tagging module above we can take real-world pictures and use them for training. The module is based on manually taking pictures, so there is control over which pictures are taken and stored.
Create a resource group
Create a Blob storage account
Create an Azure SQL database with the two tables below:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[visionkitinputs](
[id] [int] IDENTITY(1,1) NOT NULL,
[confidence] [float] NULL,
[label] [nvarchar](2000) NULL,
[EventProcessedUtcTime] [datetime] NULL,
[PartitionId] [int] NULL,
[EventEnqueuedUtcTime] [datetime] NULL,
[MessageId] [nvarchar](250) NULL,
[CorrelationId] [nvarchar](250) NULL,
[ConnectionDeviceId] [nvarchar](250) NULL,
[ConnectionDeviceGenerationId] [nvarchar](2000) NULL,
[EnqueuedTime] [datetime] NULL,
[inserttime] [datetime] NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[visionkitinputs] ADD CONSTRAINT [DF_visionkitinputs_inserttime] DEFAULT (getdate()) FOR [inserttime]
GO
/****** Object: Table [dbo].[visionkitcount] Script Date: 9/29/2019 7:24:20 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[visionkitcount](
[id] [int] IDENTITY(1,1) NOT NULL,
[Avgconfidence] [float] NULL,
[label] [nvarchar](2000) NULL,
[EventProcessedUtcTime] [datetime] NULL,
[PartitionId] [int] NULL,
[EventEnqueuedUtcTime] [datetime] NULL,
[MessageId] [nvarchar](250) NULL,
[CorrelationId] [nvarchar](250) NULL,
[ConnectionDeviceId] [nvarchar](250) NULL,
[ConnectionDeviceGenerationId] [nvarchar](2000) NULL,
[EnqueuedTime] [datetime] NULL,
[count] [int] NULL,
[inserttime] [datetime] NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[visionkitcount] ADD CONSTRAINT [DF_visionkitcount_inserttime] DEFAULT (getdate()) FOR [inserttime]
GO
Create a Stream Analytics job with the following query:
WITH visiondata AS (
SELECT
confidence
,label
,EventProcessedUtcTime
,PartitionId
,EventEnqueuedUtcTime
,IoTHub.MessageId as MessageId
,IoTHub.CorrelationId as CorrelationId
,IoTHub.ConnectionDeviceId as ConnectionDeviceId
,IoTHub.ConnectionDeviceGenerationId as ConnectionDeviceGenerationId
,IoTHub.EnqueuedTime as EnqueuedTime
FROM
input
)
SELECT confidence,label,EventProcessedUtcTime,
PartitionId,EventEnqueuedUtcTime,
MessageId,CorrelationId,ConnectionDeviceId,
ConnectionDeviceGenerationId,EnqueuedTime INTO outputblob FROM visiondata
SELECT confidence,label,EventProcessedUtcTime,
PartitionId,EventEnqueuedUtcTime,
MessageId,CorrelationId,ConnectionDeviceId,
ConnectionDeviceGenerationId,EnqueuedTime INTO sqloutput FROM visiondata
SELECT ConnectionDeviceId,label,
avg(confidence) as Avgconfidence,
count(*) as count,
MIN(CAST(EventEnqueuedUtcTime AS DATETIME)) as EventEnqueuedUtcTime,
MIN(CAST(EventProcessedUtcTime AS DATETIME)) as EventProcessedUtcTime,
MIN(CAST(EnqueuedTime AS DATETIME)) as EnqueuedTime
INTO sqlaggr
FROM visiondata
GROUP BY TUMBLINGWINDOW(second,60),ConnectionDeviceId,label
- The first part is a CTE (common table expression).
- To send the data to multiple outputs, a temporary CTE is created, and selected columns from it are pushed into the Azure SQL table and Blob storage.
- The first two SELECT queries write to the Blob output and the Azure SQL output, respectively.
- Device metadata is also saved for further reporting.
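The 60-second tumbling-window aggregation in the last SELECT can be illustrated with a small Python sketch. This is only a simulation of the Stream Analytics semantics (fixed, non-overlapping windows keyed by device and label), not the actual runtime; the sample events are invented.

```python
from collections import defaultdict

def tumbling_aggregate(events, window_seconds=60):
    """Group events into fixed, non-overlapping time windows and compute
    avg(confidence) and count(*) per (window, device, label), mirroring
    the GROUP BY TUMBLINGWINDOW(second, 60) clause above."""
    buckets = defaultdict(list)
    for ts, device, label, confidence in events:
        # Each event belongs to exactly one window: [window_start, window_start + 60).
        window_start = (ts // window_seconds) * window_seconds
        buckets[(window_start, device, label)].append(confidence)
    return {
        key: {"Avgconfidence": sum(confs) / len(confs), "count": len(confs)}
        for key, confs in buckets.items()
    }

# (timestamp_seconds, device, label, confidence) — invented sample events
events = [
    (10, "visionkit01", "hardhat", 0.90),
    (45, "visionkit01", "hardhat", 0.80),
    (70, "visionkit01", "hardhat", 0.60),  # falls into the next 60-second window
]
print(tumbling_aggregate(events))
```

The first two events land in the window starting at 0 and are averaged together; the third opens a new window starting at 60, just as Stream Analytics would emit one aggregated row per window per device/label pair into the `sqlaggr` output.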
Create a web app to display the information.