U-SQL has cognitive capabilities to analyse pictures of people to detect age, gender and emotions. How do they work, and do I need Azure Cognitive Services?
U-SQL Cognitive Capabilities
Solution
The good news is that you only need Azure Data Lake (Analytics and Store) with a U-SQL job. The downside is that U-SQL does not yet offer the full functionality of Azure Cognitive Services, but all the basics are available. In a previous blog post we showed the basics of the cognitive capabilities in U-SQL, with an example that tags images with descriptive labels. If you have never used U-SQL before, first read that post. This follow-up post continues with two new examples: detecting emotions and detecting age & gender.
Starting point
The starting point of this blog post is an Azure Data Lake Store (ADLS) with a collection of 'random' pictures of humans. We have a folder called 'faces' that contains random images which we will use for the next two examples.
Test faces
1) Emotions Script
The emotion script scans the pictures for faces and then tries to determine the emotion of each face (anger, contempt, disgust, fear, happiness, neutral, sadness, surprise). For each face it shows where the face is located in the picture, the detected emotion, and the confidence score for that emotion.
Me a few weeks ago at a party
Referencing assemblies
For emotion scanning we need one extra reference called "ImageEmotion".
// Needed for image extraction and emotions
REFERENCE ASSEMBLY ImageCommon;
REFERENCE ASSEMBLY ImageEmotion;
Extract image files
This code, which extracts image files from an ADLS container, is exactly the same as in the previous examples.
// Get the image data from ADLS container
@images =
    EXTRACT FileName string,
            ImgData byte[]
    FROM @"/faces/{FileName}.jpg"
    USING new Cognition.Vision.ImageExtractor();
Transform data
Scanning the images for faces and their emotions is done by applying the EmotionApplier function to the images rowset with CROSS APPLY. The column names, datatypes and column order are fixed, but you can add aliases for different column names or change the order in the SELECT part of the query.
The query returns one record per face on the image. Besides the emotion you also get a confidence score, the number of faces, the face number and the position of the face on the image.
// Query detects the emotion and the confidence.
// If there are multiple faces it creates
// one record for each face. It also shows
// the position of the face on the picture.
@emotions =
    SELECT FileName.ToLower() AS FileName,
           Details.NumFaces,
           Details.FaceIndex,
           Details.RectX,
           Details.RectY,
           Details.Width,
           Details.Height,
           Details.Emotion,
           Details.Confidence
    FROM @images
    CROSS APPLY
    USING new Cognition.Vision.EmotionApplier() AS Details(
          NumFaces int,
          FaceIndex int,
          RectX float,
          RectY float,
          Width float,
          Height float,
          Emotion string,
          Confidence float);
Output data
This is the same code as in the previous examples to output the detected emotions to a file in an ADLS container.
// Output the emotions rowset to a CSV file
// located in the Azure Data Lake Store
OUTPUT @emotions
TO "/faces/emotions.csv"
ORDER BY FileName
USING Outputters.Csv(outputHeader: true);

Download the complete script here.
Now the emotion script is ready to run. Click on the submit button and wait for the job to finish. This could take a few moments! Then browse to the ADLS folder and preview the file to see the result.
The result, with the happy man from above marked in red
2) Age/gender Script
The age/gender script scans the pictures for faces and then tries to determine the age and gender of each face. It is very similar to the emotion script.
Me at 43
Referencing assemblies
For age and gender scanning we need one extra reference called "FaceSdk".
// Needed for image extraction and age/gender
REFERENCE ASSEMBLY ImageCommon;
REFERENCE ASSEMBLY FaceSdk;
Extract image files
Again the same code as in the previous examples to extract image files from an ADLS container.
// Get the image data from ADLS container
@images =
    EXTRACT FileName string,
            ImgData byte[]
    FROM @"/faces/{FileName}.jpg"
    USING new Cognition.Vision.ImageExtractor();
Transform data
Scanning the images for age and gender is done by applying the FaceDetectionApplier function to the images rowset with CROSS APPLY. The column names, datatypes and column order are fixed, but you can add aliases for different column names.
The query returns one record per face on the image. Besides the age and gender you also get the number of faces, the face number and the position on the image.
// Query detects age and gender.
// If there are multiple faces it creates
// one record for each face. It also shows
// the position of the face on the picture.
@faces_analyzed =
    SELECT FileName.ToLower() AS FileName,
           Details.NumFaces,
           Details.FaceIndex,
           Details.RectX,
           Details.RectY,
           Details.Width,
           Details.Height,
           Details.FaceAge,
           Details.FaceGender
    FROM @images
    CROSS APPLY
    USING new Cognition.Vision.FaceDetectionApplier() AS Details(
          NumFaces int,
          FaceIndex int,
          RectX float,
          RectY float,
          Width float,
          Height float,
          FaceAge int,
          FaceGender string);
Output data
Outputting the data to ADLS uses the same code as in the previous examples.
// Output the gender and age rowset to a CSV file
// located in the Azure Data Lake Store
OUTPUT @faces_analyzed
TO "/faces/agegender.csv"
USING Outputters.Csv(outputHeader: true);

Download the complete script here.
The result
Now the age and gender script is ready to run. Click on the submit button and wait for the job to finish. This could take a few moments! Then browse to the ADLS folder and preview the file to see the result.
The result, with my photo in red
Summary
This post showed you how to use U-SQL to detect emotion, age and gender in pictures. The next step could be to join these examples into one big script. If you want to try that, keep in mind that the ON clause uses two equals signs instead of one (C# instead of T-SQL): ON a.FileName == e.FileName. If you want to try these scripts yourself, you can currently only do so in the Azure portal; the U-SQL projects for Visual Studio do not yet support these extensions.
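As a minimal sketch of such a combined script: assuming the @emotions and @faces_analyzed rowsets from the two examples above are declared in the same script, the join could look like this (the @combined name, the output path and the additional join on FaceIndex to match up individual faces are illustrative assumptions, not part of the original examples):

```sql
// Hypothetical sketch: combine both examples in one script.
// Assumes @emotions and @faces_analyzed are declared above.
// Note the C#-style == in the ON clause.
@combined =
    SELECT e.FileName,
           f.FaceAge,
           f.FaceGender,
           e.Emotion,
           e.Confidence
    FROM @faces_analyzed AS f
    INNER JOIN @emotions AS e
         ON f.FileName == e.FileName
         AND f.FaceIndex == e.FaceIndex;

OUTPUT @combined
TO "/faces/combined.csv"
USING Outputters.Csv(outputHeader: true);
```

Joining on FaceIndex as well as FileName keeps the records aligned when a picture contains more than one face.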
As mentioned before, U-SQL does not yet offer the same functionality as Azure Cognitive Services, which has many more options (and which estimated my age at 39 with the same picture). Hopefully this will change, but for now the basics are working. Keep an eye on the Data Lake topic page, where we will post new examples when more functionality becomes available.