
@johnwalicki
Last active June 19, 2019 04:20

Revisions

  1. johnwalicki revised this gist Jun 19, 2019. 1 changed file with 5 additions and 5 deletions.
    10 changes: 5 additions & 5 deletions README.md
    @@ -2,11 +2,11 @@

    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.

    The flow then uploads the images, creates two zip files and then calls the [Watson Visual Recognition Custom Classifier](https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier) API.
    The flow then uploads the images, creates two zip files and finally calls the [Watson Visual Recognition Custom Classifier](https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier) API.

    To test the Visual Recognition model, the form also optional prompts for an image URL to be analyzed.
    To test the Visual Recognition model, the form optionally prompts for an image URL to be analyzed.

    To test the Visual Recognition model, the form also optional prompts for an image to upload to be analyzed.
    To test the Visual Recognition model, the form optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](https://user-images.githubusercontent.com/17571232/59550620-3d511d00-8f5c-11e9-8832-cf3797d113c0.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    @@ -25,11 +25,11 @@ Here is the web application / form it creates:

    ## Deploy on IBM Cloud Node-RED Starter Kit or Node-RED local

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.
    This flow will run in the [IBM Cloud Node-RED Starter Kit](https://cloud.ibm.com/catalog/starters/node-red-starter) or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

    ## Testing your Watson Visual Recognition Custom Classifier with Node-RED Web App

    - This flow creates a Node-RED web form at **/visualrecognition** which you can use to upload an image or paste a URL link to analyze.
    - This flow creates a Node-RED web form at **http://127.0.0.1:1880/visualrecognition** or **http://your-node-red-app.mybluemix.net/visualrecognition** which you can use to upload an image or paste a URL link to analyze.

    ## Testing your Watson Visual Recognition Custom Classifier model

  2. johnwalicki revised this gist Jun 15, 2019. 2 changed files with 679 additions and 2 deletions.
    45 changes: 44 additions & 1 deletion README.md
    @@ -1 +1,44 @@
    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.
    ## Overview

    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.

    The flow then uploads the images, creates two zip files and then calls the [Watson Visual Recognition Custom Classifier](https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier) API.
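
    Under the covers, the flow's **Prepare to Create a Classifier** function node assembles the `msg.params` object that the `node-red-node-watson` visual recognition utility node (configured for `createClassifier`) sends to that API. A condensed sketch of that function node (the full version, with its explanatory comments, is in `flow.json`):

    ```javascript
    // Condensed from the "Prepare to Create a Classifier" function node in this flow.
    // msg.filename holds the classifier name typed into the web form;
    // msg.PositiveExamplesZipped and msg.payload hold the two zip buffers
    // produced by the node-red-contrib-zip nodes.
    var classnamepos = msg.filename + "_positive_examples";

    msg.params = {};
    msg.params.name = msg.filename;                        // prefix for the returned classifier_id
    msg.params[classnamepos] = msg.PositiveExamplesZipped; // zip of >10 positive example images (required)
    msg.params.negative_examples = msg.payload;            // zip of >10 negative example images (optional)

    return msg; // the visual-recognition-util-v3 node performs the createClassifier call
    ```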

    To test the Visual Recognition model, the form also optional prompts for an image URL to be analyzed.

    To test the Visual Recognition model, the form also optional prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](https://user-images.githubusercontent.com/17571232/59550620-3d511d00-8f5c-11e9-8832-cf3797d113c0.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    Here is the web application / form it creates:
    ![Watson Visual Recognition Web Form](https://user-images.githubusercontent.com/17571232/59550626-55c13780-8f5c-11e9-845a-507c87150d60.png?raw=true "Watson Visual Recognition Simple Web App")

    ## Prerequisites

    - Register for a free [IBM Cloud Account](http://cloud.ibm.com/registration)
    - Log into [IBM Cloud](http://cloud.ibm.com)
    - Create a [Watson Visual Recognition service](https://cloud.ibm.com/catalog/services/visual-recognition)
    - Return to the [IBM Cloud Resources Dashboard](https://cloud.ibm.com/resources)
    - Click on your Watson Visual Recognition instance
    - Copy the Watson Visual Recognition API key to your clipboard
    - This flow requires [node-red-contrib-zip](https://flows.nodered.org/node/node-red-contrib-zip) and [node-red-node-watson](https://flows.nodered.org/node/node-red-node-watson)
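
    The `node-red-contrib-zip` node expects `msg.payload` to be an array of `{filename, payload}` objects, one per file to compress, with the zip name taken from `msg.filename`. The flow's **Construct Zip File attributes** function node builds that array from the uploaded form files; a condensed sketch (the image-count and MIME-type validation in the real node is omitted here):

    ```javascript
    // Condensed from the "Construct Zip File attributes" function node.
    // msg.req.files is populated by the "http in" node because file upload is enabled.
    var PosZipArray = [];
    var NegZipArray = [];

    for (var i = 0; i < msg.req.files.length; i++) {
        var entry = { "filename": msg.req.files[i].originalname, "payload": msg.req.files[i].buffer };
        if (msg.req.files[i].fieldname === "Positive") {
            PosZipArray.push(entry);       // goes into the positive-examples zip
        } else if (msg.req.files[i].fieldname === "Negative") {
            NegZipArray.push(entry);       // goes into the negative-examples zip
        }
    }

    msg.filename = msg.payload.ClassifierName;  // classifier / zip name from the web form
    msg.payload = PosZipArray;                  // compressed first by the "Zip Positive Examples" node
    msg.NegativeExamples = NegZipArray;         // compressed afterwards by the "Zip Negative Examples" node
    return msg;
    ```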

    ## Deploy on IBM Cloud Node-RED Starter Kit or Node-RED local

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

    ## Testing your Watson Visual Recognition Custom Classifier with Node-RED Web App

    - This flow creates a Node-RED web form at **/visualrecognition** which you can use to upload an image or paste a URL link to analyze.
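    - The form's **Analyze Image URL** button submits the URL back to the same path as an `imageurl` query parameter, so you can also test straight from the address bar with something like `http://127.0.0.1:1880/visualrecognition?imageurl=<link-to-an-image>` (substitute your own image link and your Node-RED base URL).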

    ## Testing your Watson Visual Recognition Custom Classifier model

    - Open your [Watson Visual Recognition instance](https://cloud.ibm.com/resources?search=vision)
    - Click on **Create a Custom Model**
    ![Watson Visual Recognition Service](https://user-images.githubusercontent.com/17571232/59550643-8acd8a00-8f5c-11e9-94cc-275b5f0b7952.png?raw=true "Watson Visual Recognition Service Instance")
    - Scroll down to the **Custom Models** section and click on **Test** to open Watson Studio
    ![Watson Visual Recognition Custom Model](https://user-images.githubusercontent.com/17571232/59550652-ab95df80-8f5c-11e9-8ace-817d6ce19f76.png?raw=true "Watson Visual Recognition Custom Model")
    - Click on the **Test** tab
    ![Watson Visual Recognition Custom Model Overview](https://user-images.githubusercontent.com/17571232/59550655-c49e9080-8f5c-11e9-84d4-16593ab3e4ee.png?raw=true "Watson Visual Recognition Custom Model Overview")
    - Upload test images to validate your trained model
    ![Watson Visual Recognition Custom Model Test](https://user-images.githubusercontent.com/17571232/59550669-df710500-8f5c-11e9-960f-f409e8f77a93.png?raw=true "Watson Visual Recognition Custom Test")
    636 changes: 635 additions & 1 deletion flow.json
    @@ -1 +1,635 @@
    []
    [
    {
    "id": "7eeff30a.6e3d1c",
    "type": "tab",
    "label": "Watson Visual Recognition",
    "disabled": false,
    "info": ""
    },
    {
    "id": "2dd9981d.e20cb8",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Extract image URL",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "payload.imageurl",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 610,
    "y": 100,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "571eeb0b.a8b4a4",
    "type": "switch",
    "z": "7eeff30a.6e3d1c",
    "name": "Check image url",
    "property": "payload.imageurl",
    "propertyType": "msg",
    "rules": [
    {
    "t": "null"
    },
    {
    "t": "else"
    }
    ],
    "checkall": "true",
    "outputs": 2,
    "x": 360,
    "y": 60,
    "wires": [
    [
    "28547df4.9ce35a"
    ],
    [
    "2dd9981d.e20cb8"
    ]
    ]
    },
    {
    "id": "1c452e89.35c2d1",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/visualrecognition",
    "method": "get",
    "upload": false,
    "swaggerDoc": "",
    "x": 140,
    "y": 60,
    "wires": [
    [
    "571eeb0b.a8b4a4"
    ]
    ]
    },
    {
    "id": "28547df4.9ce35a",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "Simpe Web Page",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>",
    "x": 810,
    "y": 60,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "56ef5dbe.d9afbc",
    "type": "http response",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "statusCode": "",
    "headers": {},
    "x": 1070,
    "y": 340,
    "wires": []
    },
    {
    "id": "33779adb.e3084e",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "Print msg.result.images",
    "active": true,
    "console": "false",
    "complete": "result.images",
    "x": 630,
    "y": 400,
    "wires": []
    },
    {
    "id": "3aaa0f22.a317d",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Step #1 - Create a Visual Recognition Service",
    "info": "1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. Open the above \"visual recognition v3\" node and paste your new API Key",
    "x": 260,
    "y": 420,
    "wires": []
    },
    {
    "id": "f321cd95.dfd7a8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - Multiple Classifiers",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 
10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n",
    "outputs": 1,
    "noerr": 0,
    "x": 670,
    "y": 360,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "55464a02.d2b9f4",
    "type": "visual-recognition-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway.watsonplatform.net/visual-recognition/api",
    "image-feature": "classifyImage",
    "lang": "en",
    "x": 290,
    "y": 380,
    "wires": [
    [
    "33779adb.e3084e",
    "f321cd95.dfd7a8"
    ]
    ]
    },
    {
    "id": "200da38a.be96bc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - One Classifier",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 680,
    "y": 320,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "35370570.d632ea",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 350,
    "y": 180,
    "wires": []
    },
    {
    "id": "8a388039.2d1e",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Simple file upload example",
    "info": "http://localhost:1880/upload",
    "x": 130,
    "y": 180,
    "wires": []
    },
    {
    "id": "58949c54.02b47c",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/uploadsimple_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 130,
    "y": 220,
    "wires": [
    [
    "35370570.d632ea",
    "f1fe18f1.271458"
    ]
    ]
    },
    {
    "id": "f1fe18f1.271458",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "req.files[0].buffer",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 370,
    "y": 220,
    "wires": [
    [
    "8f488946.2633d8"
    ]
    ]
    },
    {
    "id": "8f488946.2633d8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Save Picture Buffer",
    "func": "if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 610,
    "y": 220,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "6d4954fa.ac16cc",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Multiple file upload",
    "info": "",
    "x": 150,
    "y": 480,
    "wires": []
    },
    {
    "id": "85333459.a13e2",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/upload2zip_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 160,
    "y": 520,
    "wires": [
    [
    "3faf2672.a8acf2",
    "d7cf5668.86f84"
    ]
    ]
    },
    {
    "id": "3faf2672.a8acf2",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 390,
    "y": 480,
    "wires": []
    },
    {
    "id": "d7cf5668.86f84",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Construct Zip File attributes",
    "func": "// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( !msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];",
    "outputs": 2,
    "noerr": 0,
    "x": 440,
    "y": 520,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ],
    [
    "11a95717.109f19",
    "94f3c37.b02ff4",
    "7c795730.6d26d"
    ]
    ]
    },
    {
    "id": "7c795730.6d26d",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Positive Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 180,
    "y": 620,
    "wires": [
    [
    "61d68c04.1ead5c"
    ]
    ]
    },
    {
    "id": "11a95717.109f19",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Success",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "Zip file created! Watson Visual Recognition is Training a custom classifier",
    "tot": "str"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 740,
    "y": 500,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "94f3c37.b02ff4",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "true",
    "targetType": "full",
    "x": 730,
    "y": 540,
    "wires": []
    },
    {
    "id": "972816d2.00f088",
    "type": "visual-recognition-util-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway-a.watsonplatform.net/visual-recognition/api",
    "image-feature": "createClassifier",
    "x": 500,
    "y": 700,
    "wires": [
    [
    "f6c7cbd7.78b798",
    "ba09e21c.8ead08"
    ]
    ]
    },
    {
    "id": "47169bc6.ad95dc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Prepare to Create a Classifier",
    "func": "// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. (Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;",
    "outputs": 1,
    "noerr": 0,
    "x": 190,
    "y": 700,
    "wires": [
    [
    "972816d2.00f088",
    "eb571d37.5e4fd"
    ]
    ]
    },
    {
    "id": "f6c7cbd7.78b798",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "result",
    "targetType": "msg",
    "x": 730,
    "y": 760,
    "wires": []
    },
    {
    "id": "61d68c04.1ead5c",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip 2nd Set of Examples",
    "rules": [
    {
    "t": "set",
    "p": "PositiveExamplesZipped",
    "pt": "msg",
    "to": "payload",
    "tot": "msg"
    },
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "NegativeExamples",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 430,
    "y": 620,
    "wires": [
    [
    "d8f6de11.7f6c9"
    ]
    ]
    },
    {
    "id": "d8f6de11.7f6c9",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Negative Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 690,
    "y": 620,
    "wires": [
    [
    "47169bc6.ad95dc"
    ]
    ]
    },
    {
    "id": "eb571d37.5e4fd",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 470,
    "y": 760,
    "wires": []
    },
    {
    "id": "ba09e21c.8ead08",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "result.classifier_id",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 770,
    "y": 700,
    "wires": [
    [
    "e3d0da66.c3b4b"
    ]
    ]
    },
    {
    "id": "b56287e.3114ef8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Custom Classifier",
    "func": "var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 840,
    "y": 160,
    "wires": [
    [
    "55464a02.d2b9f4",
    "771ce36c.58c1fc"
    ]
    ]
    },
    {
    "id": "771ce36c.58c1fc",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 1070,
    "y": 160,
    "wires": []
    },
    {
    "id": "6f46531e.748754",
    "type": "inject",
    "z": "7eeff30a.6e3d1c",
    "name": "Store a PreBuilt Custom Classifier ID",
    "topic": "",
    "payload": "YourCustomClassifier_1724727066",
    "payloadType": "str",
    "repeat": "",
    "crontab": "",
    "once": false,
    "onceDelay": 0.1,
    "x": 210,
    "y": 820,
    "wires": [
    [
    "fe081768.a87008"
    ]
    ]
    },
    {
    "id": "fe081768.a87008",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "payload",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 510,
    "y": 820,
    "wires": [
    []
    ]
    },
    {
    "id": "e3d0da66.c3b4b",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "Please wait for the {{result.classifier_id}} to complete training.",
    "output": "str",
    "x": 980,
    "y": 700,
    "wires": [
    [
    "caa94b72.4bd62"
    ]
    ]
    },
    {
    "id": "caa94b72.4bd62",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "false",
    "x": 1150,
    "y": 700,
    "wires": []
    }
    ]
  3. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion flow.json
    @@ -1 +1 @@
    [{"id":"7eeff30a.6e3d1c","type":"tab","label":"Watson Visual Recognition","disabled":false,"info":""},{"id":"2dd9981d.e20cb8","type":"change","z":"7eeff30a.6e3d1c","name":"Extract image URL","rules":[{"t":"set","p":"payload","pt":"msg","to":"payload.imageurl","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":610,"y":100,"wires":[["b56287e.3114ef8"]]},{"id":"571eeb0b.a8b4a4","type":"switch","z":"7eeff30a.6e3d1c","name":"Check image url","property":"payload.imageurl","propertyType":"msg","rules":[{"t":"null"},{"t":"else"}],"checkall":"true","outputs":2,"x":360,"y":60,"wires":[["28547df4.9ce35a"],["2dd9981d.e20cb8"]]},{"id":"1c452e89.35c2d1","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/visualrecognition","method":"get","upload":false,"swaggerDoc":"","x":140,"y":60,"wires":[["571eeb0b.a8b4a4"]]},{"id":"28547df4.9ce35a","type":"template","z":"7eeff30a.6e3d1c","name":"Simpe Web Page","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>","x":810,"y":60,"wires":[["56ef5dbe.d9afbc"]]},{"id":"56ef5dbe.d9afbc","type":"http response","z":"7eeff30a.6e3d1c","name":"","statusCode":"","headers":{},"x":1070,"y":340,"wires":[]},{"id":"33779adb.e3084e","type":"debug","z":"7eeff30a.6e3d1c","name":"Print msg.result.images","active":true,"console":"false","complete":"result.images","x":630,"y":400,"wires":[]},{"id":"3aaa0f22.a317d","type":"comment","z":"7eeff30a.6e3d1c","name":"Step #1 - Create a Visual Recognition Service","info":"1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. 
Open the above \"visual recognition v3\" node and paste your new API Key","x":260,"y":420,"wires":[]},{"id":"f321cd95.dfd7a8","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - Multiple Classifiers","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; 
}\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n","outputs":1,"noerr":0,"x":670,"y":360,"wires":[["56ef5dbe.d9afbc"]]},{"id":"55464a02.d2b9f4","type":"visual-recognition-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway.watsonplatform.net/visual-recognition/api","image-feature":"classifyImage","lang":"en","x":290,"y":380,"wires":[["33779adb.e3084e","f321cd95.dfd7a8"]]},{"id":"200da38a.be96bc","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - One Classifier","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = 
msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;","outputs":1,"noerr":0,"x":680,"y":320,"wires":[["56ef5dbe.d9afbc"]]},{"id":"35370570.d632ea","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":350,"y":180,"wires":[]},{"id":"8a388039.2d1e","type":"comment","z":"7eeff30a.6e3d1c","name":"Simple file upload example","info":"","x":130,"y":180,"wires":[]},{"id":"58949c54.02b47c","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/uploadsimple_post","method":"post","upload":true,"swaggerDoc":"","x":130,"y":220,"wires":[["35370570.d632ea","f1fe18f1.271458"]]},{"id":"f1fe18f1.271458","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"req.files[0].buffer","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":370,"y":220,"wires":[["8f488946.2633d8"]]},{"id":"8f488946.2633d8","type":"function","z":"7eeff30a.6e3d1c","name":"Save Picture Buffer","func":"if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":610,"y":220,"wires":[["b56287e.3114ef8"]]},{"id":"6d4954fa.ac16cc","type":"comment","z":"7eeff30a.6e3d1c","name":"Multiple file upload","info":"","x":150,"y":480,"wires":[]},{"id":"85333459.a13e2","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/upload2zip_post","method":"post","upload":true,"swaggerDoc":"","x":160,"y":520,"wires":[["3faf2672.a8acf2","d7cf5668.86f84"]]},{"id":"3faf2672.a8acf2","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":390,"y":480,"wires":[]},{"id":"d7cf5668.86f84","type":"function","z":"7eeff30a.6e3d1c","name":"Construct Zip File attributes","func":"// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( 
!msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];","outputs":2,"noerr":0,"x":440,"y":520,"wires":[["56ef5dbe.d9afbc"],["11a95717.109f19","94f3c37.b02ff4","7c795730.6d26d"]]},{"id":"7c795730.6d26d","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Positive Examples","mode":"compress","filename":"","outasstring":false,"x":180,"y":620,"wires":[["61d68c04.1ead5c"]]},{"id":"11a95717.109f19","type":"change","z":"7eeff30a.6e3d1c","name":"Success","rules":[{"t":"set","p":"payload","pt":"msg","to":"Zip file created! Watson Visual Recognition is Training a custom classifier","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":740,"y":500,"wires":[["56ef5dbe.d9afbc"]]},{"id":"94f3c37.b02ff4","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":730,"y":540,"wires":[]},{"id":"972816d2.00f088","type":"visual-recognition-util-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway-a.watsonplatform.net/visual-recognition/api","image-feature":"createClassifier","x":500,"y":700,"wires":[["f6c7cbd7.78b798","ba09e21c.8ead08"]]},{"id":"47169bc6.ad95dc","type":"function","z":"7eeff30a.6e3d1c","name":"Prepare to Create a Classifier","func":"// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. 
(Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;","outputs":1,"noerr":0,"x":190,"y":700,"wires":[["972816d2.00f088","eb571d37.5e4fd"]]},{"id":"f6c7cbd7.78b798","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"result","targetType":"msg","x":730,"y":760,"wires":[]},{"id":"61d68c04.1ead5c","type":"change","z":"7eeff30a.6e3d1c","name":"Zip 2nd Set of Examples","rules":[{"t":"set","p":"PositiveExamplesZipped","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"NegativeExamples","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":430,"y":620,"wires":[["d8f6de11.7f6c9"]]},{"id":"d8f6de11.7f6c9","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Negative Examples","mode":"compress","filename":"","outasstring":false,"x":690,"y":620,"wires":[["47169bc6.ad95dc"]]},{"id":"eb571d37.5e4fd","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":470,"y":760,"wires":[]},{"id":"ba09e21c.8ead08","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"result.classifier_id","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":770,"y":700,"wires":[["e3d0da66.c3b4b"]]},{"id":"b56287e.3114ef8","type":"function","z":"7eeff30a.6e3d1c","name":"Custom Classifier","func":"var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":840,"y":160,"wires":[["55464a02.d2b9f4","771ce36c.58c1fc"]]},{"id":"771ce36c.58c1fc","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":1070,"y":160,"wires":[]},{"id":"6f46531e.748754","type":"inject","z":"7eeff30a.6e3d1c","name":"Store a PreBuilt Custom Classifier ID","topic":"","payload":"YourCustomClassifier_1724727066","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":210,"y":820,"wires":[["fe081768.a87008"]]},{"id":"fe081768.a87008","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":510,"y":820,"wires":[[]]},{"id":"e3d0da66.c3b4b","type":"template","z":"7eeff30a.6e3d1c","name":"","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Please wait for the {{result.classifier_id}} to complete 
training.","output":"str","x":980,"y":700,"wires":[["caa94b72.4bd62"]]},{"id":"caa94b72.4bd62","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":1150,"y":700,"wires":[]}]
    []
  4. johnwalicki revised this gist Jun 14, 2019. 6 changed files with 0 additions and 0 deletions.
    Binary file removed WatsonVisualReco-CustomModel.png
    Binary file removed WatsonVisualReco-CustomModelOverview.png
    Binary file removed WatsonVisualReco-CustomModelTest.png
    Binary file removed WatsonVisualReco-ServiceInstance.png
    Binary file removed WatsonVisualReco-SimpleWebApp.png
    Binary file removed WatsonVisualReco-flow-screenshot.png
  5. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion flow.json
    @@ -1 +1 @@
    [{"id":"7eeff30a.6e3d1c","type":"tab","label":"Watson Visual Recognition","disabled":false,"info":""},{"id":"2dd9981d.e20cb8","type":"change","z":"7eeff30a.6e3d1c","name":"Extract image URL","rules":[{"t":"set","p":"payload","pt":"msg","to":"payload.imageurl","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":610,"y":100,"wires":[["b56287e.3114ef8"]]},{"id":"571eeb0b.a8b4a4","type":"switch","z":"7eeff30a.6e3d1c","name":"Check image url","property":"payload.imageurl","propertyType":"msg","rules":[{"t":"null"},{"t":"else"}],"checkall":"true","outputs":2,"x":360,"y":60,"wires":[["28547df4.9ce35a"],["2dd9981d.e20cb8"]]},{"id":"1c452e89.35c2d1","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/visualrecognition","method":"get","upload":false,"swaggerDoc":"","x":140,"y":60,"wires":[["571eeb0b.a8b4a4"]]},{"id":"28547df4.9ce35a","type":"template","z":"7eeff30a.6e3d1c","name":"Simpe Web Page","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>","x":810,"y":60,"wires":[["56ef5dbe.d9afbc"]]},{"id":"56ef5dbe.d9afbc","type":"http response","z":"7eeff30a.6e3d1c","name":"","statusCode":"","headers":{},"x":1070,"y":340,"wires":[]},{"id":"33779adb.e3084e","type":"debug","z":"7eeff30a.6e3d1c","name":"Print msg.result.images","active":true,"console":"false","complete":"result.images","x":630,"y":400,"wires":[]},{"id":"3aaa0f22.a317d","type":"comment","z":"7eeff30a.6e3d1c","name":"Step #1 - Create a Visual Recognition Service","info":"1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. 
Open the above \"visual recognition v3\" node and paste your new API Key","x":260,"y":420,"wires":[]},{"id":"f321cd95.dfd7a8","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - Multiple Classifiers","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; 
}\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n","outputs":1,"noerr":0,"x":670,"y":360,"wires":[["56ef5dbe.d9afbc"]]},{"id":"55464a02.d2b9f4","type":"visual-recognition-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway.watsonplatform.net/visual-recognition/api","image-feature":"classifyImage","lang":"en","x":290,"y":380,"wires":[["33779adb.e3084e","f321cd95.dfd7a8"]]},{"id":"200da38a.be96bc","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - One Classifier","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = 
msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;","outputs":1,"noerr":0,"x":680,"y":320,"wires":[["56ef5dbe.d9afbc"]]},{"id":"35370570.d632ea","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":350,"y":180,"wires":[]},{"id":"8a388039.2d1e","type":"comment","z":"7eeff30a.6e3d1c","name":"Simple file upload example","info":"","x":130,"y":180,"wires":[]},{"id":"58949c54.02b47c","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/uploadsimple_post","method":"post","upload":true,"swaggerDoc":"","x":130,"y":220,"wires":[["35370570.d632ea","f1fe18f1.271458"]]},{"id":"f1fe18f1.271458","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"req.files[0].buffer","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":370,"y":220,"wires":[["8f488946.2633d8"]]},{"id":"8f488946.2633d8","type":"function","z":"7eeff30a.6e3d1c","name":"Save Picture Buffer","func":"if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":610,"y":220,"wires":[["b56287e.3114ef8"]]},{"id":"6d4954fa.ac16cc","type":"comment","z":"7eeff30a.6e3d1c","name":"Multiple file upload","info":"","x":150,"y":480,"wires":[]},{"id":"85333459.a13e2","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/upload2zip_post","method":"post","upload":true,"swaggerDoc":"","x":160,"y":520,"wires":[["3faf2672.a8acf2","d7cf5668.86f84"]]},{"id":"3faf2672.a8acf2","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":390,"y":480,"wires":[]},{"id":"d7cf5668.86f84","type":"function","z":"7eeff30a.6e3d1c","name":"Construct Zip File attributes","func":"// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( 
!msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];","outputs":2,"noerr":0,"x":440,"y":520,"wires":[["56ef5dbe.d9afbc"],["11a95717.109f19","94f3c37.b02ff4","7c795730.6d26d"]]},{"id":"7c795730.6d26d","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Positive Examples","mode":"compress","filename":"","outasstring":false,"x":180,"y":620,"wires":[["61d68c04.1ead5c"]]},{"id":"11a95717.109f19","type":"change","z":"7eeff30a.6e3d1c","name":"Success","rules":[{"t":"set","p":"payload","pt":"msg","to":"Zip file created! Watson Visual Recognition is Training a custom classifier","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":740,"y":500,"wires":[["56ef5dbe.d9afbc"]]},{"id":"94f3c37.b02ff4","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":730,"y":540,"wires":[]},{"id":"972816d2.00f088","type":"visual-recognition-util-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway-a.watsonplatform.net/visual-recognition/api","image-feature":"createClassifier","x":500,"y":700,"wires":[["f6c7cbd7.78b798","ba09e21c.8ead08"]]},{"id":"47169bc6.ad95dc","type":"function","z":"7eeff30a.6e3d1c","name":"Prepare to Create a Classifier","func":"// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. 
(Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;","outputs":1,"noerr":0,"x":190,"y":700,"wires":[["972816d2.00f088","eb571d37.5e4fd"]]},{"id":"f6c7cbd7.78b798","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"result","targetType":"msg","x":730,"y":760,"wires":[]},{"id":"61d68c04.1ead5c","type":"change","z":"7eeff30a.6e3d1c","name":"Zip 2nd Set of Examples","rules":[{"t":"set","p":"PositiveExamplesZipped","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"NegativeExamples","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":430,"y":620,"wires":[["d8f6de11.7f6c9"]]},{"id":"d8f6de11.7f6c9","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Negative Examples","mode":"compress","filename":"","outasstring":false,"x":690,"y":620,"wires":[["47169bc6.ad95dc"]]},{"id":"eb571d37.5e4fd","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":470,"y":760,"wires":[]},{"id":"ba09e21c.8ead08","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"result.classifier_id","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":770,"y":700,"wires":[["e3d0da66.c3b4b"]]},{"id":"b56287e.3114ef8","type":"function","z":"7eeff30a.6e3d1c","name":"Custom Classifier","func":"var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":840,"y":160,"wires":[["55464a02.d2b9f4","771ce36c.58c1fc"]]},{"id":"771ce36c.58c1fc","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":1070,"y":160,"wires":[]},{"id":"6f46531e.748754","type":"inject","z":"7eeff30a.6e3d1c","name":"Store a PreBuilt Custom Classifier ID","topic":"","payload":"YourCustomClassifier_1724727066","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":210,"y":820,"wires":[["fe081768.a87008"]]},{"id":"fe081768.a87008","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":510,"y":820,"wires":[[]]},{"id":"e3d0da66.c3b4b","type":"template","z":"7eeff30a.6e3d1c","name":"","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Please wait for the {{result.classifier_id}} to complete 
training.","output":"str","x":980,"y":700,"wires":[["caa94b72.4bd62"]]},{"id":"caa94b72.4bd62","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":1150,"y":700,"wires":[]}]
    [{"id":"7eeff30a.6e3d1c","type":"tab","label":"Watson Visual Recognition","disabled":false,"info":""},{"id":"2dd9981d.e20cb8","type":"change","z":"7eeff30a.6e3d1c","name":"Extract image URL","rules":[{"t":"set","p":"payload","pt":"msg","to":"payload.imageurl","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":610,"y":100,"wires":[["b56287e.3114ef8"]]},{"id":"571eeb0b.a8b4a4","type":"switch","z":"7eeff30a.6e3d1c","name":"Check image url","property":"payload.imageurl","propertyType":"msg","rules":[{"t":"null"},{"t":"else"}],"checkall":"true","outputs":2,"x":360,"y":60,"wires":[["28547df4.9ce35a"],["2dd9981d.e20cb8"]]},{"id":"1c452e89.35c2d1","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/visualrecognition","method":"get","upload":false,"swaggerDoc":"","x":140,"y":60,"wires":[["571eeb0b.a8b4a4"]]},{"id":"28547df4.9ce35a","type":"template","z":"7eeff30a.6e3d1c","name":"Simpe Web Page","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>","x":810,"y":60,"wires":[["56ef5dbe.d9afbc"]]},{"id":"56ef5dbe.d9afbc","type":"http response","z":"7eeff30a.6e3d1c","name":"","statusCode":"","headers":{},"x":1070,"y":340,"wires":[]},{"id":"33779adb.e3084e","type":"debug","z":"7eeff30a.6e3d1c","name":"Print msg.result.images","active":true,"console":"false","complete":"result.images","x":630,"y":400,"wires":[]},{"id":"3aaa0f22.a317d","type":"comment","z":"7eeff30a.6e3d1c","name":"Step #1 - Create a Visual Recognition Service","info":"1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. 
Open the above \"visual recognition v3\" node and paste your new API Key","x":260,"y":420,"wires":[]},{"id":"f321cd95.dfd7a8","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - Multiple Classifiers","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; 
}\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n","outputs":1,"noerr":0,"x":670,"y":360,"wires":[["56ef5dbe.d9afbc"]]},{"id":"55464a02.d2b9f4","type":"visual-recognition-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway.watsonplatform.net/visual-recognition/api","image-feature":"classifyImage","lang":"en","x":290,"y":380,"wires":[["33779adb.e3084e","f321cd95.dfd7a8"]]},{"id":"200da38a.be96bc","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - One Classifier","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = 
msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;","outputs":1,"noerr":0,"x":680,"y":320,"wires":[["56ef5dbe.d9afbc"]]},{"id":"35370570.d632ea","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":350,"y":180,"wires":[]},{"id":"8a388039.2d1e","type":"comment","z":"7eeff30a.6e3d1c","name":"Simple file upload example","info":"","x":130,"y":180,"wires":[]},{"id":"58949c54.02b47c","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/uploadsimple_post","method":"post","upload":true,"swaggerDoc":"","x":130,"y":220,"wires":[["35370570.d632ea","f1fe18f1.271458"]]},{"id":"f1fe18f1.271458","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"req.files[0].buffer","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":370,"y":220,"wires":[["8f488946.2633d8"]]},{"id":"8f488946.2633d8","type":"function","z":"7eeff30a.6e3d1c","name":"Save Picture Buffer","func":"if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":610,"y":220,"wires":[["b56287e.3114ef8"]]},{"id":"6d4954fa.ac16cc","type":"comment","z":"7eeff30a.6e3d1c","name":"Multiple file upload","info":"","x":150,"y":480,"wires":[]},{"id":"85333459.a13e2","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/upload2zip_post","method":"post","upload":true,"swaggerDoc":"","x":160,"y":520,"wires":[["3faf2672.a8acf2","d7cf5668.86f84"]]},{"id":"3faf2672.a8acf2","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":390,"y":480,"wires":[]},{"id":"d7cf5668.86f84","type":"function","z":"7eeff30a.6e3d1c","name":"Construct Zip File attributes","func":"// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( 
!msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];","outputs":2,"noerr":0,"x":440,"y":520,"wires":[["56ef5dbe.d9afbc"],["11a95717.109f19","94f3c37.b02ff4","7c795730.6d26d"]]},{"id":"7c795730.6d26d","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Positive Examples","mode":"compress","filename":"","outasstring":false,"x":180,"y":620,"wires":[["61d68c04.1ead5c"]]},{"id":"11a95717.109f19","type":"change","z":"7eeff30a.6e3d1c","name":"Success","rules":[{"t":"set","p":"payload","pt":"msg","to":"Zip file created! Watson Visual Recognition is Training a custom classifier","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":740,"y":500,"wires":[["56ef5dbe.d9afbc"]]},{"id":"94f3c37.b02ff4","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":730,"y":540,"wires":[]},{"id":"972816d2.00f088","type":"visual-recognition-util-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway-a.watsonplatform.net/visual-recognition/api","image-feature":"createClassifier","x":500,"y":700,"wires":[["f6c7cbd7.78b798","ba09e21c.8ead08"]]},{"id":"47169bc6.ad95dc","type":"function","z":"7eeff30a.6e3d1c","name":"Prepare to Create a Classifier","func":"// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. 
(Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;","outputs":1,"noerr":0,"x":190,"y":700,"wires":[["972816d2.00f088","eb571d37.5e4fd"]]},{"id":"f6c7cbd7.78b798","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"result","targetType":"msg","x":730,"y":760,"wires":[]},{"id":"61d68c04.1ead5c","type":"change","z":"7eeff30a.6e3d1c","name":"Zip 2nd Set of Examples","rules":[{"t":"set","p":"PositiveExamplesZipped","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"NegativeExamples","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":430,"y":620,"wires":[["d8f6de11.7f6c9"]]},{"id":"d8f6de11.7f6c9","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Negative Examples","mode":"compress","filename":"","outasstring":false,"x":690,"y":620,"wires":[["47169bc6.ad95dc"]]},{"id":"eb571d37.5e4fd","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":470,"y":760,"wires":[]},{"id":"ba09e21c.8ead08","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"result.classifier_id","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":770,"y":700,"wires":[["e3d0da66.c3b4b"]]},{"id":"b56287e.3114ef8","type":"function","z":"7eeff30a.6e3d1c","name":"Custom Classifier","func":"var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":840,"y":160,"wires":[["55464a02.d2b9f4","771ce36c.58c1fc"]]},{"id":"771ce36c.58c1fc","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":1070,"y":160,"wires":[]},{"id":"6f46531e.748754","type":"inject","z":"7eeff30a.6e3d1c","name":"Store a PreBuilt Custom Classifier ID","topic":"","payload":"YourCustomClassifier_1724727066","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":210,"y":820,"wires":[["fe081768.a87008"]]},{"id":"fe081768.a87008","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":510,"y":820,"wires":[[]]},{"id":"e3d0da66.c3b4b","type":"template","z":"7eeff30a.6e3d1c","name":"","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Please wait for the {{result.classifier_id}} to complete 
training.","output":"str","x":980,"y":700,"wires":[["caa94b72.4bd62"]]},{"id":"caa94b72.4bd62","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":1150,"y":700,"wires":[]}]
  6. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 1 addition and 635 deletions.
    636 changes: 1 addition & 635 deletions flow.json
    Original file line number Diff line number Diff line change
    @@ -1,635 +1 @@
    [
    {
    "id": "7eeff30a.6e3d1c",
    "type": "tab",
    "label": "Watson Visual Recognition",
    "disabled": false,
    "info": ""
    },
    {
    "id": "2dd9981d.e20cb8",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Extract image URL",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "payload.imageurl",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 610,
    "y": 100,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "571eeb0b.a8b4a4",
    "type": "switch",
    "z": "7eeff30a.6e3d1c",
    "name": "Check image url",
    "property": "payload.imageurl",
    "propertyType": "msg",
    "rules": [
    {
    "t": "null"
    },
    {
    "t": "else"
    }
    ],
    "checkall": "true",
    "outputs": 2,
    "x": 360,
    "y": 60,
    "wires": [
    [
    "28547df4.9ce35a"
    ],
    [
    "2dd9981d.e20cb8"
    ]
    ]
    },
    {
    "id": "1c452e89.35c2d1",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/visualrecognition",
    "method": "get",
    "upload": false,
    "swaggerDoc": "",
    "x": 140,
    "y": 60,
    "wires": [
    [
    "571eeb0b.a8b4a4"
    ]
    ]
    },
    {
    "id": "28547df4.9ce35a",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "Simpe Web Page",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>",
    "x": 810,
    "y": 60,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "56ef5dbe.d9afbc",
    "type": "http response",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "statusCode": "",
    "headers": {},
    "x": 1070,
    "y": 340,
    "wires": []
    },
    {
    "id": "33779adb.e3084e",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "Print msg.result.images",
    "active": true,
    "console": "false",
    "complete": "result.images",
    "x": 630,
    "y": 400,
    "wires": []
    },
    {
    "id": "3aaa0f22.a317d",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Step #1 - Create a Visual Recognition Service",
    "info": "1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. Open the above \"visual recognition v3\" node and paste your new API Key",
    "x": 260,
    "y": 420,
    "wires": []
    },
    {
    "id": "f321cd95.dfd7a8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - Multiple Classifiers",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 
10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n",
    "outputs": 1,
    "noerr": 0,
    "x": 670,
    "y": 360,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "55464a02.d2b9f4",
    "type": "visual-recognition-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway.watsonplatform.net/visual-recognition/api",
    "image-feature": "classifyImage",
    "lang": "en",
    "x": 290,
    "y": 380,
    "wires": [
    [
    "33779adb.e3084e",
    "f321cd95.dfd7a8"
    ]
    ]
    },
    {
    "id": "200da38a.be96bc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - One Classifier",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 680,
    "y": 320,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "35370570.d632ea",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 350,
    "y": 180,
    "wires": []
    },
    {
    "id": "8a388039.2d1e",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Simple file upload example",
    "info": "http://localhost:1880/upload",
    "x": 130,
    "y": 180,
    "wires": []
    },
    {
    "id": "58949c54.02b47c",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/uploadsimple_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 130,
    "y": 220,
    "wires": [
    [
    "35370570.d632ea",
    "f1fe18f1.271458"
    ]
    ]
    },
    {
    "id": "f1fe18f1.271458",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "req.files[0].buffer",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 370,
    "y": 220,
    "wires": [
    [
    "8f488946.2633d8"
    ]
    ]
    },
    {
    "id": "8f488946.2633d8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Save Picture Buffer",
    "func": "if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 610,
    "y": 220,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "6d4954fa.ac16cc",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Multiple file upload",
    "info": "",
    "x": 150,
    "y": 480,
    "wires": []
    },
    {
    "id": "85333459.a13e2",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/upload2zip_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 160,
    "y": 520,
    "wires": [
    [
    "3faf2672.a8acf2",
    "d7cf5668.86f84"
    ]
    ]
    },
    {
    "id": "3faf2672.a8acf2",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 390,
    "y": 480,
    "wires": []
    },
    {
    "id": "d7cf5668.86f84",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Construct Zip File attributes",
    "func": "// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( !msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];",
    "outputs": 2,
    "noerr": 0,
    "x": 440,
    "y": 520,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ],
    [
    "11a95717.109f19",
    "94f3c37.b02ff4",
    "7c795730.6d26d"
    ]
    ]
    },
    {
    "id": "7c795730.6d26d",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Positive Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 180,
    "y": 620,
    "wires": [
    [
    "61d68c04.1ead5c"
    ]
    ]
    },
    {
    "id": "11a95717.109f19",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Success",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "Zip file created! Watson Visual Recognition is Training a custom classifier",
    "tot": "str"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 740,
    "y": 500,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "94f3c37.b02ff4",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "true",
    "targetType": "full",
    "x": 730,
    "y": 540,
    "wires": []
    },
    {
    "id": "972816d2.00f088",
    "type": "visual-recognition-util-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway-a.watsonplatform.net/visual-recognition/api",
    "image-feature": "createClassifier",
    "x": 500,
    "y": 700,
    "wires": [
    [
    "f6c7cbd7.78b798",
    "ba09e21c.8ead08"
    ]
    ]
    },
    {
    "id": "47169bc6.ad95dc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Prepare to Create a Classifier",
    "func": "// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. (Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;",
    "outputs": 1,
    "noerr": 0,
    "x": 190,
    "y": 700,
    "wires": [
    [
    "972816d2.00f088",
    "eb571d37.5e4fd"
    ]
    ]
    },
    {
    "id": "f6c7cbd7.78b798",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "result",
    "targetType": "msg",
    "x": 730,
    "y": 760,
    "wires": []
    },
    {
    "id": "61d68c04.1ead5c",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip 2nd Set of Examples",
    "rules": [
    {
    "t": "set",
    "p": "PositiveExamplesZipped",
    "pt": "msg",
    "to": "payload",
    "tot": "msg"
    },
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "NegativeExamples",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 430,
    "y": 620,
    "wires": [
    [
    "d8f6de11.7f6c9"
    ]
    ]
    },
    {
    "id": "d8f6de11.7f6c9",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Negative Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 690,
    "y": 620,
    "wires": [
    [
    "47169bc6.ad95dc"
    ]
    ]
    },
    {
    "id": "eb571d37.5e4fd",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 470,
    "y": 760,
    "wires": []
    },
    {
    "id": "ba09e21c.8ead08",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "result.classifier_id",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 770,
    "y": 700,
    "wires": [
    [
    "e3d0da66.c3b4b"
    ]
    ]
    },
    {
    "id": "b56287e.3114ef8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Custom Classifier",
    "func": "var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 840,
    "y": 160,
    "wires": [
    [
    "55464a02.d2b9f4",
    "771ce36c.58c1fc"
    ]
    ]
    },
    {
    "id": "771ce36c.58c1fc",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 1070,
    "y": 160,
    "wires": []
    },
    {
    "id": "6f46531e.748754",
    "type": "inject",
    "z": "7eeff30a.6e3d1c",
    "name": "Store a PreBuilt Custom Classifier ID",
    "topic": "",
    "payload": "YourCustomClassifier_1724727066",
    "payloadType": "str",
    "repeat": "",
    "crontab": "",
    "once": false,
    "onceDelay": 0.1,
    "x": 210,
    "y": 820,
    "wires": [
    [
    "fe081768.a87008"
    ]
    ]
    },
    {
    "id": "fe081768.a87008",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "payload",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 510,
    "y": 820,
    "wires": [
    []
    ]
    },
    {
    "id": "e3d0da66.c3b4b",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "Please wait for the {{result.classifier_id}} to complete training.",
    "output": "str",
    "x": 980,
    "y": 700,
    "wires": [
    [
    "caa94b72.4bd62"
    ]
    ]
    },
    {
    "id": "caa94b72.4bd62",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "false",
    "x": 1150,
    "y": 700,
    "wires": []
    }
    ]
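For readers skimming the exported JSON above, the training half of the flow hinges on the `Prepare to Create a Classifier` function node. Its logic, pulled out of the export and commented for readability (every property it reads, `msg.filename`, `msg.PositiveExamplesZipped`, and the zipped negative examples arriving in `msg.payload`, is set by earlier nodes in this flow):

```javascript
// Function node "Prepare to Create a Classifier":
// build msg.params for the Watson Visual Recognition "Create Classifier"
// utility node (node-red-node-watson).
var classnamepos = msg.filename + "_positive_examples";

msg.params = {};
msg.params.name = msg.filename;                        // prefix for the returned classifier_id
msg.params.negative_examples = msg.payload;            // ZIP (Buffer) of >= 10 negative images
msg.params[classnamepos] = msg.PositiveExamplesZipped; // ZIP (Buffer) of >= 10 positive images

return msg;
```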
    [{"id":"7eeff30a.6e3d1c","type":"tab","label":"Watson Visual Recognition","disabled":false,"info":""},{"id":"2dd9981d.e20cb8","type":"change","z":"7eeff30a.6e3d1c","name":"Extract image URL","rules":[{"t":"set","p":"payload","pt":"msg","to":"payload.imageurl","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":610,"y":100,"wires":[["b56287e.3114ef8"]]},{"id":"571eeb0b.a8b4a4","type":"switch","z":"7eeff30a.6e3d1c","name":"Check image url","property":"payload.imageurl","propertyType":"msg","rules":[{"t":"null"},{"t":"else"}],"checkall":"true","outputs":2,"x":360,"y":60,"wires":[["28547df4.9ce35a"],["2dd9981d.e20cb8"]]},{"id":"1c452e89.35c2d1","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/visualrecognition","method":"get","upload":false,"swaggerDoc":"","x":140,"y":60,"wires":[["571eeb0b.a8b4a4"]]},{"id":"28547df4.9ce35a","type":"template","z":"7eeff30a.6e3d1c","name":"Simpe Web Page","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>","x":810,"y":60,"wires":[["56ef5dbe.d9afbc"]]},{"id":"56ef5dbe.d9afbc","type":"http response","z":"7eeff30a.6e3d1c","name":"","statusCode":"","headers":{},"x":1070,"y":340,"wires":[]},{"id":"33779adb.e3084e","type":"debug","z":"7eeff30a.6e3d1c","name":"Print msg.result.images","active":true,"console":"false","complete":"result.images","x":630,"y":400,"wires":[]},{"id":"3aaa0f22.a317d","type":"comment","z":"7eeff30a.6e3d1c","name":"Step #1 - Create a Visual Recognition Service","info":"1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. 
Open the above \"visual recognition v3\" node and paste your new API Key","x":260,"y":420,"wires":[]},{"id":"f321cd95.dfd7a8","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - Multiple Classifiers","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; 
}\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n","outputs":1,"noerr":0,"x":670,"y":360,"wires":[["56ef5dbe.d9afbc"]]},{"id":"55464a02.d2b9f4","type":"visual-recognition-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway.watsonplatform.net/visual-recognition/api","image-feature":"classifyImage","lang":"en","x":290,"y":380,"wires":[["33779adb.e3084e","f321cd95.dfd7a8"]]},{"id":"200da38a.be96bc","type":"function","z":"7eeff30a.6e3d1c","name":"Process Results - One Classifier","func":"if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = 
msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;","outputs":1,"noerr":0,"x":680,"y":320,"wires":[["56ef5dbe.d9afbc"]]},{"id":"35370570.d632ea","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":350,"y":180,"wires":[]},{"id":"8a388039.2d1e","type":"comment","z":"7eeff30a.6e3d1c","name":"Simple file upload example","info":"","x":130,"y":180,"wires":[]},{"id":"58949c54.02b47c","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/uploadsimple_post","method":"post","upload":true,"swaggerDoc":"","x":130,"y":220,"wires":[["35370570.d632ea","f1fe18f1.271458"]]},{"id":"f1fe18f1.271458","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"req.files[0].buffer","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":370,"y":220,"wires":[["8f488946.2633d8"]]},{"id":"8f488946.2633d8","type":"function","z":"7eeff30a.6e3d1c","name":"Save Picture Buffer","func":"if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":610,"y":220,"wires":[["b56287e.3114ef8"]]},{"id":"6d4954fa.ac16cc","type":"comment","z":"7eeff30a.6e3d1c","name":"Multiple file upload","info":"","x":150,"y":480,"wires":[]},{"id":"85333459.a13e2","type":"http in","z":"7eeff30a.6e3d1c","name":"","url":"/upload2zip_post","method":"post","upload":true,"swaggerDoc":"","x":160,"y":520,"wires":[["3faf2672.a8acf2","d7cf5668.86f84"]]},{"id":"3faf2672.a8acf2","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"complete":"req.files","x":390,"y":480,"wires":[]},{"id":"d7cf5668.86f84","type":"function","z":"7eeff30a.6e3d1c","name":"Construct Zip File attributes","func":"// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( 
!msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];","outputs":2,"noerr":0,"x":440,"y":520,"wires":[["56ef5dbe.d9afbc"],["11a95717.109f19","94f3c37.b02ff4","7c795730.6d26d"]]},{"id":"7c795730.6d26d","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Positive Examples","mode":"compress","filename":"","outasstring":false,"x":180,"y":620,"wires":[["61d68c04.1ead5c"]]},{"id":"11a95717.109f19","type":"change","z":"7eeff30a.6e3d1c","name":"Success","rules":[{"t":"set","p":"payload","pt":"msg","to":"Zip file created! Watson Visual Recognition is Training a custom classifier","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":740,"y":500,"wires":[["56ef5dbe.d9afbc"]]},{"id":"94f3c37.b02ff4","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":730,"y":540,"wires":[]},{"id":"972816d2.00f088","type":"visual-recognition-util-v3","z":"7eeff30a.6e3d1c","name":"","vr-service-endpoint":"https://gateway-a.watsonplatform.net/visual-recognition/api","image-feature":"createClassifier","x":500,"y":700,"wires":[["f6c7cbd7.78b798","ba09e21c.8ead08"]]},{"id":"47169bc6.ad95dc","type":"function","z":"7eeff30a.6e3d1c","name":"Prepare to Create a Classifier","func":"// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. 
(Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;","outputs":1,"noerr":0,"x":190,"y":700,"wires":[["972816d2.00f088","eb571d37.5e4fd"]]},{"id":"f6c7cbd7.78b798","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"result","targetType":"msg","x":730,"y":760,"wires":[]},{"id":"61d68c04.1ead5c","type":"change","z":"7eeff30a.6e3d1c","name":"Zip 2nd Set of Examples","rules":[{"t":"set","p":"PositiveExamplesZipped","pt":"msg","to":"payload","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"NegativeExamples","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":430,"y":620,"wires":[["d8f6de11.7f6c9"]]},{"id":"d8f6de11.7f6c9","type":"zip","z":"7eeff30a.6e3d1c","name":"Zip Negative Examples","mode":"compress","filename":"","outasstring":false,"x":690,"y":620,"wires":[["47169bc6.ad95dc"]]},{"id":"eb571d37.5e4fd","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":470,"y":760,"wires":[]},{"id":"ba09e21c.8ead08","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"result.classifier_id","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":770,"y":700,"wires":[["e3d0da66.c3b4b"]]},{"id":"b56287e.3114ef8","type":"function","z":"7eeff30a.6e3d1c","name":"Custom Classifier","func":"var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;","outputs":1,"noerr":0,"x":840,"y":160,"wires":[["55464a02.d2b9f4","771ce36c.58c1fc"]]},{"id":"771ce36c.58c1fc","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"params","targetType":"msg","x":1070,"y":160,"wires":[]},{"id":"6f46531e.748754","type":"inject","z":"7eeff30a.6e3d1c","name":"Store a PreBuilt Custom Classifier ID","topic":"","payload":"YourCustomClassifier_1724727066","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":210,"y":820,"wires":[["fe081768.a87008"]]},{"id":"fe081768.a87008","type":"change","z":"7eeff30a.6e3d1c","name":"","rules":[{"t":"set","p":"CustomClassifier","pt":"flow","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":510,"y":820,"wires":[[]]},{"id":"e3d0da66.c3b4b","type":"template","z":"7eeff30a.6e3d1c","name":"","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Please wait for the {{result.classifier_id}} to complete 
training.","output":"str","x":980,"y":700,"wires":[["caa94b72.4bd62"]]},{"id":"caa94b72.4bd62","type":"debug","z":"7eeff30a.6e3d1c","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":1150,"y":700,"wires":[]}]
  7. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 1 addition and 43 deletions.
    44 changes: 1 addition & 43 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -1,43 +1 @@
    ## Overview

    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.

    The flow then uploads the images, creates two zip files and then calls the [Watson Visual Recognition Custom Classifier](https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier) API.

    To test the Visual Recognition model, the form also optionally prompts for an image URL to be analyzed.

    To test the Visual Recognition model, the form also optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/26742e61edc4d7f671cb66500b652bb9fb5f56d7/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")

    ## Prerequisites

    - Register for a free [IBM Cloud Account](http://cloud.ibm.com/registration)
    - Log into [IBM Cloud](http://cloud.ibm.com)
    - Create a [Watson Visual Recognition service](https://cloud.ibm.com/catalog/services/visual-recognition)
    - Return to the [IBM Cloud Resources Dashboard](https://cloud.ibm.com/resources)
    - Click on your Watson Visual Recognition instance
    - Copy the Watson Visual Recognition API key to your clipboard
    - This flow requires [node-red-contrib-zip](https://flows.nodered.org/node/node-red-contrib-zip) and [node-red-node-watson](https://flows.nodered.org/node/node-red-node-watson)

    ## Deploy on IBM Cloud Node-RED Starter Kit or Node-RED local

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

    ## Testing your Watson Visual Recognition Custom Classifier with Node-RED Web App

    - This flow creates a Node-RED web form at **/visualrecognition** which you can use to upload an image or paste a URL link to analyze.

    ## Testing your Watson Visual Recognition Custom Classifier model

    - Open your [Watson Visual Recognition instance](https://cloud.ibm.com/resources?search=vision)
    - Click on **Create a Custom Model**
    ![Watson Visual Recognition Service](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-ServiceInstance.png?raw=true "Watson Visual Recognition Service Instance")
    - Scroll down to the **Custom Models** section and click on **Test** to open Watson Studio
    ![Watson Visual Recognition Custom Model](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModel.png?raw=true "Watson Visual Recognition Custom Model")
    - Click on the **Test** tab
    ![Watson Visual Recognition Custom Model Overview](https://gist.github.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModelOverview.png?raw=true "Watson Visual Recognition Custom Model Overview")
    - Upload test images to validate your trained model
    ![Watson Visual Recognition Custom Model Test](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModelTest.png?raw=true "Watson Visual Recognition Custom Test")
    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.
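The testing steps above depend on the `Custom Classifier` function node to pick which classifiers the classify call should use. It reads the classifier ID stored in flow context after training (or injected by the `Store a PreBuilt Custom Classifier ID` node) and queries it alongside Watson's built-in `default` classifier; if no custom classifier has been trained yet, only `default` is used. The node's logic, reproduced from the flow for reference:

```javascript
// Function node "Custom Classifier": choose classifier_ids for the classify call.
var CustomClassifier = flow.get("CustomClassifier") || "";
msg.params = {};

// Check if a Custom Classifier has been trained
if (CustomClassifier.length) {
    msg.params.classifier_ids = CustomClassifier + ",default";
} else {
    msg.params.classifier_ids = "default";
}

return msg;
```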
  8. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 2 additions and 3 deletions.
    5 changes: 2 additions & 3 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -12,7 +12,6 @@ To test the Visual Recognition model, the form also optional prompts for an imag

    ![Watson Visual Recognition Web Form](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/26742e61edc4d7f671cb66500b652bb9fb5f56d7/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")


    ## Prerequisites

    - Register for a free [IBM Cloud Account](http://cloud.ibm.com/registration)
    @@ -27,9 +26,9 @@ To test the Visual Recognition model, the form also optional prompts for an imag

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

    ## Testing your Watson Visual Recognition Custom Classifier model
    ## Testing your Watson Visual Recognition Custom Classifier with Node-RED Web App

    - Use the Node-RED web form at **/visualrecognition** to upload or link to test images.
    - This flow creates a Node-RED web form at **/visualrecognition** which you can use to upload an image or paste a URL link to analyze.

    ## Testing your Watson Visual Recognition Custom Classifier model

  9. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion README.md
    Original file line number Diff line number Diff line change
    @@ -10,7 +10,7 @@ To test the Visual Recognition model, the form also optional prompts for an imag

    ![Watson Visual Recognition Web Form Flow](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")
    ![Watson Visual Recognition Web Form](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/26742e61edc4d7f671cb66500b652bb9fb5f56d7/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")


    ## Prerequisites
  10. johnwalicki revised this gist Jun 14, 2019. 1 changed file with 0 additions and 0 deletions.
    Binary file modified WatsonVisualReco-SimpleWebApp.png
  11. johnwalicki revised this gist Jun 14, 2019. 6 changed files with 635 additions and 0 deletions.
    Binary file modified WatsonVisualReco-CustomModelTest.png
    Empty file removed WatsonVisualReco-NodeRED-flow.json
    Empty file.
    Binary file modified WatsonVisualReco-ServiceInstance.png
    Binary file modified WatsonVisualReco-SimpleWebApp.png
    Binary file modified WatsonVisualReco-flow-screenshot.png
    635 changes: 635 additions & 0 deletions flow.json
    Original file line number Diff line number Diff line change
    @@ -0,0 +1,635 @@
    [
    {
    "id": "7eeff30a.6e3d1c",
    "type": "tab",
    "label": "Watson Visual Recognition",
    "disabled": false,
    "info": ""
    },
    {
    "id": "2dd9981d.e20cb8",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Extract image URL",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "payload.imageurl",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 610,
    "y": 100,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "571eeb0b.a8b4a4",
    "type": "switch",
    "z": "7eeff30a.6e3d1c",
    "name": "Check image url",
    "property": "payload.imageurl",
    "propertyType": "msg",
    "rules": [
    {
    "t": "null"
    },
    {
    "t": "else"
    }
    ],
    "checkall": "true",
    "outputs": 2,
    "x": 360,
    "y": 60,
    "wires": [
    [
    "28547df4.9ce35a"
    ],
    [
    "2dd9981d.e20cb8"
    ]
    ]
    },
    {
    "id": "1c452e89.35c2d1",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/visualrecognition",
    "method": "get",
    "upload": false,
    "swaggerDoc": "",
    "x": 140,
    "y": 60,
    "wires": [
    [
    "571eeb0b.a8b4a4"
    ]
    ]
    },
    {
    "id": "28547df4.9ce35a",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "Simpe Web Page",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>",
    "x": 810,
    "y": 60,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "56ef5dbe.d9afbc",
    "type": "http response",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "statusCode": "",
    "headers": {},
    "x": 1070,
    "y": 340,
    "wires": []
    },
    {
    "id": "33779adb.e3084e",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "Print msg.result.images",
    "active": true,
    "console": "false",
    "complete": "result.images",
    "x": 630,
    "y": 400,
    "wires": []
    },
    {
    "id": "3aaa0f22.a317d",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Step #1 - Create a Visual Recognition Service",
    "info": "1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. Open the above \"visual recognition v3\" node and paste your new API Key",
    "x": 260,
    "y": 420,
    "wires": []
    },
    {
    "id": "f321cd95.dfd7a8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - Multiple Classifiers",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 
10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n",
    "outputs": 1,
    "noerr": 0,
    "x": 670,
    "y": 360,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "55464a02.d2b9f4",
    "type": "visual-recognition-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway.watsonplatform.net/visual-recognition/api",
    "image-feature": "classifyImage",
    "lang": "en",
    "x": 290,
    "y": 380,
    "wires": [
    [
    "33779adb.e3084e",
    "f321cd95.dfd7a8"
    ]
    ]
    },
    {
    "id": "200da38a.be96bc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - One Classifier",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 680,
    "y": 320,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "35370570.d632ea",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 350,
    "y": 180,
    "wires": []
    },
    {
    "id": "8a388039.2d1e",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Simple file upload example",
    "info": "http://localhost:1880/upload",
    "x": 130,
    "y": 180,
    "wires": []
    },
    {
    "id": "58949c54.02b47c",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/uploadsimple_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 130,
    "y": 220,
    "wires": [
    [
    "35370570.d632ea",
    "f1fe18f1.271458"
    ]
    ]
    },
    {
    "id": "f1fe18f1.271458",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "req.files[0].buffer",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 370,
    "y": 220,
    "wires": [
    [
    "8f488946.2633d8"
    ]
    ]
    },
    {
    "id": "8f488946.2633d8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Save Picture Buffer",
    "func": "if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 610,
    "y": 220,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "6d4954fa.ac16cc",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Multiple file upload",
    "info": "",
    "x": 150,
    "y": 480,
    "wires": []
    },
    {
    "id": "85333459.a13e2",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/upload2zip_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 160,
    "y": 520,
    "wires": [
    [
    "3faf2672.a8acf2",
    "d7cf5668.86f84"
    ]
    ]
    },
    {
    "id": "3faf2672.a8acf2",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 390,
    "y": 480,
    "wires": []
    },
    {
    "id": "d7cf5668.86f84",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Construct Zip File attributes",
    "func": "// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( !msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];",
    "outputs": 2,
    "noerr": 0,
    "x": 440,
    "y": 520,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ],
    [
    "11a95717.109f19",
    "94f3c37.b02ff4",
    "7c795730.6d26d"
    ]
    ]
    },
    {
    "id": "7c795730.6d26d",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Positive Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 180,
    "y": 620,
    "wires": [
    [
    "61d68c04.1ead5c"
    ]
    ]
    },
    {
    "id": "11a95717.109f19",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Success",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "Zip file created! Watson Visual Recognition is Training a custom classifier",
    "tot": "str"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 740,
    "y": 500,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "94f3c37.b02ff4",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "true",
    "targetType": "full",
    "x": 730,
    "y": 540,
    "wires": []
    },
    {
    "id": "972816d2.00f088",
    "type": "visual-recognition-util-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway-a.watsonplatform.net/visual-recognition/api",
    "image-feature": "createClassifier",
    "x": 500,
    "y": 700,
    "wires": [
    [
    "f6c7cbd7.78b798",
    "ba09e21c.8ead08"
    ]
    ]
    },
    {
    "id": "47169bc6.ad95dc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Prepare to Create a Classifier",
    "func": "// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. (Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;",
    "outputs": 1,
    "noerr": 0,
    "x": 190,
    "y": 700,
    "wires": [
    [
    "972816d2.00f088",
    "eb571d37.5e4fd"
    ]
    ]
    },
    {
    "id": "f6c7cbd7.78b798",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "result",
    "targetType": "msg",
    "x": 730,
    "y": 760,
    "wires": []
    },
    {
    "id": "61d68c04.1ead5c",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip 2nd Set of Examples",
    "rules": [
    {
    "t": "set",
    "p": "PositiveExamplesZipped",
    "pt": "msg",
    "to": "payload",
    "tot": "msg"
    },
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "NegativeExamples",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 430,
    "y": 620,
    "wires": [
    [
    "d8f6de11.7f6c9"
    ]
    ]
    },
    {
    "id": "d8f6de11.7f6c9",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Negative Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 690,
    "y": 620,
    "wires": [
    [
    "47169bc6.ad95dc"
    ]
    ]
    },
    {
    "id": "eb571d37.5e4fd",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 470,
    "y": 760,
    "wires": []
    },
    {
    "id": "ba09e21c.8ead08",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "result.classifier_id",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 770,
    "y": 700,
    "wires": [
    [
    "e3d0da66.c3b4b"
    ]
    ]
    },
    {
    "id": "b56287e.3114ef8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Custom Classifier",
    "func": "var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 840,
    "y": 160,
    "wires": [
    [
    "55464a02.d2b9f4",
    "771ce36c.58c1fc"
    ]
    ]
    },
    {
    "id": "771ce36c.58c1fc",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 1070,
    "y": 160,
    "wires": []
    },
    {
    "id": "6f46531e.748754",
    "type": "inject",
    "z": "7eeff30a.6e3d1c",
    "name": "Store a PreBuilt Custom Classifier ID",
    "topic": "",
    "payload": "YourCustomClassifier_1724727066",
    "payloadType": "str",
    "repeat": "",
    "crontab": "",
    "once": false,
    "onceDelay": 0.1,
    "x": 210,
    "y": 820,
    "wires": [
    [
    "fe081768.a87008"
    ]
    ]
    },
    {
    "id": "fe081768.a87008",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "payload",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 510,
    "y": 820,
    "wires": [
    []
    ]
    },
    {
    "id": "e3d0da66.c3b4b",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "Please wait for the {{result.classifier_id}} to complete training.",
    "output": "str",
    "x": 980,
    "y": 700,
    "wires": [
    [
    "caa94b72.4bd62"
    ]
    ]
    },
    {
    "id": "caa94b72.4bd62",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "false",
    "x": 1150,
    "y": 700,
    "wires": []
    }
    ]
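Both `Process Results` function nodes in the flow above consume the same response shape from the Watson Visual Recognition node: `msg.result.images[0].classifiers[]`, where each classifier carries a `classes[]` array of `{class, score}` entries. A stripped-down sketch of that traversal (the flow's color matching, result messages, and HTML table generation are omitted):

```javascript
// Minimal walk over the classify result consumed by the "Process Results" nodes.
// Sketch only: the flow builds an HTML table rather than plain text.
if (typeof msg.result === 'undefined') {
    return null;
}
if (typeof msg.result.error !== 'undefined') {
    msg.payload = msg.result.error.message; // e.g. Lite plan transaction limit reached
    return msg;
}

var lines = [];
msg.result.images[0].classifiers.forEach(function (classifier) {
    classifier.classes.forEach(function (entry) {
        // Each entry pairs a class label with a confidence score between 0 and 1.
        lines.push(classifier.name + ": " + entry.class + " (" + entry.score + ")");
    });
});

msg.payload = lines.join("\n");
return msg;
```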
  12. johnwalicki revised this gist Jun 14, 2019. 2 changed files with 4 additions and 4 deletions.
    8 changes: 4 additions & 4 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -35,10 +35,10 @@ This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance

    - Open your [Watson Visual Recognition instance](https://cloud.ibm.com/resources?search=vision)
    - Click on **Create a Custom Model**
    ![Watson Visual Recognition Service](WatsonVisualReco-ServiceInstance.png?raw=true "Watson Visual Recognition Service Instance")
    ![Watson Visual Recognition Service](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-ServiceInstance.png?raw=true "Watson Visual Recognition Service Instance")
    - Scroll down to the **Custom Models** section and click on **Test** to open Watson Studio
    ![Watson Visual Recognition Custom Model](WatsonVisualReco-CustomModel.png?raw=true "Watson Visual Recognition Custom Model")
    ![Watson Visual Recognition Custom Model](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModel.png?raw=true "Watson Visual Recognition Custom Model")
    - Click on the **Test** tab
    ![Watson Visual Recognition Custom Model Overview](WatsonVisualReco-CustomModelOverview.png?raw=true "Watson Visual Recognition Custom Model Overview")
    ![Watson Visual Recognition Custom Model Overview](https://gist.github.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModelOverview.png?raw=true "Watson Visual Recognition Custom Model Overview")
    - Upload test images to validate your trained model
    ![Watson Visual Recognition Custom Model Test](WatsonVisualReco-CustomModelTest.png?raw=true "Watson Visual Recognition Custom Test")
    ![Watson Visual Recognition Custom Model Test](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/217a698bef7d694a9fe4e3dfc5b74af54ef0f8ee/WatsonVisualReco-CustomModelTest.png?raw=true "Watson Visual Recognition Custom Test")
    Binary file modified WatsonVisualReco-CustomModelTest.png
  13. johnwalicki revised this gist Jun 14, 2019. 6 changed files with 2 additions and 638 deletions.
    5 changes: 2 additions & 3 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -8,10 +8,9 @@ To test the Visual Recognition model, the form also optional prompts for an imag

    To test the Visual Recognition model, the form also optional prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](raw/7b8d26459ac64d44ea5bfd831cc942cdc7aff466/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")
    ![Watson Visual Recognition Web Form Flow](raw/5e9f36a9b53192a212c67992cacd5097624d45d3/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Custom Classifier Flow")
    ![Watson Visual Recognition Web Form Flow](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")
    ![Watson Visual Recognition Web Form](https://gist.githubusercontent.com/johnwalicki/fbf4fea94d931b308ab29dbbf11245d8/raw/543f4b3bced5cc005ff3770d54f13168679651c7/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")


    ## Prerequisites
    Binary file modified WatsonVisualReco-CustomModelTest.png
    635 changes: 0 additions & 635 deletions WatsonVisualReco-NodeRED-flow.json
    Original file line number Diff line number Diff line change
    @@ -1,635 +0,0 @@
    Binary file modified WatsonVisualReco-ServiceInstance.png
    Binary file modified WatsonVisualReco-SimpleWebApp.png
    Binary file modified WatsonVisualReco-flow-screenshot.png
  14. johnwalicki revised this gist Jun 14, 2019. 5 changed files with 635 additions and 0 deletions.
    Binary file modified WatsonVisualReco-CustomModelTest.png
    635 changes: 635 additions & 0 deletions WatsonVisualReco-NodeRED-flow.json
    Original file line number Diff line number Diff line change
    @@ -0,0 +1,635 @@
    Binary file modified WatsonVisualReco-ServiceInstance.png
    Binary file modified WatsonVisualReco-SimpleWebApp.png
    Binary file modified WatsonVisualReco-flow-screenshot.png
  15. johnwalicki revised this gist Jun 14, 2019. 2 changed files with 1 addition and 1 deletion.
    2 changes: 1 addition & 1 deletion README.md
    Original file line number Diff line number Diff line change
    @@ -9,7 +9,7 @@ To test the Visual Recognition model, the form also optional prompts for an imag
    To test the Visual Recognition model, the form also optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](raw/7b8d26459ac64d44ea5bfd831cc942cdc7aff466/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")
    ![Watson Visual Recognition Web Form Flow](WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")
    ![Watson Visual Recognition Web Form Flow](raw/5e9f36a9b53192a212c67992cacd5097624d45d3/WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")

    Binary file modified WatsonVisualReco-CustomModelTest.png
  16. johnwalicki revised this gist Jun 14, 2019. 6 changed files with 1 addition and 635 deletions.
    1 change: 1 addition & 0 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -8,6 +8,7 @@ To test the Visual Recognition model, the form also optional prompts for an imag

    To test the Visual Recognition model, the form also optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](raw/7b8d26459ac64d44ea5bfd831cc942cdc7aff466/WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")
    ![Watson Visual Recognition Web Form Flow](WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")
    Binary file modified WatsonVisualReco-CustomModelTest.png
    Binary file modified WatsonVisualReco-ServiceInstance.png
    Binary file modified WatsonVisualReco-SimpleWebApp.png
    Binary file modified WatsonVisualReco-flow-screenshot.png
    635 changes: 0 additions & 635 deletions flow.json
    Original file line number Diff line number Diff line change
    @@ -1,635 +0,0 @@
    [
    {
    "id": "7eeff30a.6e3d1c",
    "type": "tab",
    "label": "Watson Visual Recognition",
    "disabled": false,
    "info": ""
    },
    {
    "id": "2dd9981d.e20cb8",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Extract image URL",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "payload.imageurl",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 610,
    "y": 100,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "571eeb0b.a8b4a4",
    "type": "switch",
    "z": "7eeff30a.6e3d1c",
    "name": "Check image url",
    "property": "payload.imageurl",
    "propertyType": "msg",
    "rules": [
    {
    "t": "null"
    },
    {
    "t": "else"
    }
    ],
    "checkall": "true",
    "outputs": 2,
    "x": 360,
    "y": 60,
    "wires": [
    [
    "28547df4.9ce35a"
    ],
    [
    "2dd9981d.e20cb8"
    ]
    ]
    },
    {
    "id": "1c452e89.35c2d1",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/visualrecognition",
    "method": "get",
    "upload": false,
    "swaggerDoc": "",
    "x": 140,
    "y": 60,
    "wires": [
    [
    "571eeb0b.a8b4a4"
    ]
    ]
    },
    {
    "id": "28547df4.9ce35a",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "Simpe Web Page",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>",
    "x": 810,
    "y": 60,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "56ef5dbe.d9afbc",
    "type": "http response",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "statusCode": "",
    "headers": {},
    "x": 1070,
    "y": 340,
    "wires": []
    },
    {
    "id": "33779adb.e3084e",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "Print msg.result.images",
    "active": true,
    "console": "false",
    "complete": "result.images",
    "x": 630,
    "y": 400,
    "wires": []
    },
    {
    "id": "3aaa0f22.a317d",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Step #1 - Create a Visual Recognition Service",
    "info": "1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. Open the above \"visual recognition v3\" node and paste your new API Key",
    "x": 260,
    "y": 420,
    "wires": []
    },
    {
    "id": "f321cd95.dfd7a8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - Multiple Classifiers",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 
10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n",
    "outputs": 1,
    "noerr": 0,
    "x": 670,
    "y": 360,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "55464a02.d2b9f4",
    "type": "visual-recognition-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway.watsonplatform.net/visual-recognition/api",
    "image-feature": "classifyImage",
    "lang": "en",
    "x": 290,
    "y": 380,
    "wires": [
    [
    "33779adb.e3084e",
    "f321cd95.dfd7a8"
    ]
    ]
    },
    {
    "id": "200da38a.be96bc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - One Classifier",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 680,
    "y": 320,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "35370570.d632ea",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 350,
    "y": 180,
    "wires": []
    },
    {
    "id": "8a388039.2d1e",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Simple file upload example",
    "info": "http://localhost:1880/upload",
    "x": 130,
    "y": 180,
    "wires": []
    },
    {
    "id": "58949c54.02b47c",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/uploadsimple_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 130,
    "y": 220,
    "wires": [
    [
    "35370570.d632ea",
    "f1fe18f1.271458"
    ]
    ]
    },
    {
    "id": "f1fe18f1.271458",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "req.files[0].buffer",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 370,
    "y": 220,
    "wires": [
    [
    "8f488946.2633d8"
    ]
    ]
    },
    {
    "id": "8f488946.2633d8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Save Picture Buffer",
    "func": "if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 610,
    "y": 220,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "6d4954fa.ac16cc",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Multiple file upload",
    "info": "",
    "x": 150,
    "y": 480,
    "wires": []
    },
    {
    "id": "85333459.a13e2",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/upload2zip_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 160,
    "y": 520,
    "wires": [
    [
    "3faf2672.a8acf2",
    "d7cf5668.86f84"
    ]
    ]
    },
    {
    "id": "3faf2672.a8acf2",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 390,
    "y": 480,
    "wires": []
    },
    {
    "id": "d7cf5668.86f84",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Construct Zip File attributes",
    "func": "// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( !msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];",
    "outputs": 2,
    "noerr": 0,
    "x": 440,
    "y": 520,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ],
    [
    "11a95717.109f19",
    "94f3c37.b02ff4",
    "7c795730.6d26d"
    ]
    ]
    },
    {
    "id": "7c795730.6d26d",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Positive Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 180,
    "y": 620,
    "wires": [
    [
    "61d68c04.1ead5c"
    ]
    ]
    },
    {
    "id": "11a95717.109f19",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Success",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "Zip file created! Watson Visual Recognition is Training a custom classifier",
    "tot": "str"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 740,
    "y": 500,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "94f3c37.b02ff4",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "true",
    "targetType": "full",
    "x": 730,
    "y": 540,
    "wires": []
    },
    {
    "id": "972816d2.00f088",
    "type": "visual-recognition-util-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway-a.watsonplatform.net/visual-recognition/api",
    "image-feature": "createClassifier",
    "x": 500,
    "y": 700,
    "wires": [
    [
    "f6c7cbd7.78b798",
    "ba09e21c.8ead08"
    ]
    ]
    },
    {
    "id": "47169bc6.ad95dc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Prepare to Create a Classifier",
    "func": "// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. (Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;",
    "outputs": 1,
    "noerr": 0,
    "x": 190,
    "y": 700,
    "wires": [
    [
    "972816d2.00f088",
    "eb571d37.5e4fd"
    ]
    ]
    },
    {
    "id": "f6c7cbd7.78b798",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "result",
    "targetType": "msg",
    "x": 730,
    "y": 760,
    "wires": []
    },
    {
    "id": "61d68c04.1ead5c",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip 2nd Set of Examples",
    "rules": [
    {
    "t": "set",
    "p": "PositiveExamplesZipped",
    "pt": "msg",
    "to": "payload",
    "tot": "msg"
    },
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "NegativeExamples",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 430,
    "y": 620,
    "wires": [
    [
    "d8f6de11.7f6c9"
    ]
    ]
    },
    {
    "id": "d8f6de11.7f6c9",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Negative Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 690,
    "y": 620,
    "wires": [
    [
    "47169bc6.ad95dc"
    ]
    ]
    },
    {
    "id": "eb571d37.5e4fd",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 470,
    "y": 760,
    "wires": []
    },
    {
    "id": "ba09e21c.8ead08",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "result.classifier_id",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 770,
    "y": 700,
    "wires": [
    [
    "e3d0da66.c3b4b"
    ]
    ]
    },
    {
    "id": "b56287e.3114ef8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Custom Classifier",
    "func": "var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 840,
    "y": 160,
    "wires": [
    [
    "55464a02.d2b9f4",
    "771ce36c.58c1fc"
    ]
    ]
    },
    {
    "id": "771ce36c.58c1fc",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 1070,
    "y": 160,
    "wires": []
    },
    {
    "id": "6f46531e.748754",
    "type": "inject",
    "z": "7eeff30a.6e3d1c",
    "name": "Store a PreBuilt Custom Classifier ID",
    "topic": "",
    "payload": "YourCustomClassifier_1724727066",
    "payloadType": "str",
    "repeat": "",
    "crontab": "",
    "once": false,
    "onceDelay": 0.1,
    "x": 210,
    "y": 820,
    "wires": [
    [
    "fe081768.a87008"
    ]
    ]
    },
    {
    "id": "fe081768.a87008",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "payload",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 510,
    "y": 820,
    "wires": [
    []
    ]
    },
    {
    "id": "e3d0da66.c3b4b",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "Please wait for the {{result.classifier_id}} to complete training.",
    "output": "str",
    "x": 980,
    "y": 700,
    "wires": [
    [
    "caa94b72.4bd62"
    ]
    ]
    },
    {
    "id": "caa94b72.4bd62",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "false",
    "x": 1150,
    "y": 700,
    "wires": []
    }
    ]
  17. johnwalicki revised this gist Jun 13, 2019. 7 changed files with 16 additions and 5 deletions.
    21 changes: 16 additions & 5 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -8,26 +8,37 @@ To test the Visual Recognition model, the form also optional prompts for an imag

To test the Visual Recognition model, the form also optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](https://github.com/johnwalicki/test.png?raw=true "Watson Visual Recognition Custom Flow")
    ![Watson Visual Recognition Web Form Flow](WatsonVisualReco-flow-screenshot.png?raw=true "Watson Visual Recognition Custom Classifier Flow")

    ![Watson Visual Recognition Web Form](WatsonVisualReco-SimpleWebApp.png?raw=true "Watson Visual Recognition Simple Web App")


## Prerequisites

    - Register for a free [IBM Cloud Account](http://cloud.ibm.com/registration)
    - Create a Watson Visual Recognition service instance
    - Log into [IBM Cloud](http://cloud.ibm.com)
    - Create a [Watson Visual Recognition service](https://cloud.ibm.com/catalog/services/visual-recognition)
- Return to the [IBM Cloud Resources Dashboard](https://cloud.ibm.com/resources)
    - Click on your Watson Visual Recognition instance
    - Copy the Watson Visual Recognition API key to your clipboard
    - This flow requires [**node-red-contrib-zip**](https://flows.nodered.org/node/node-red-contrib-zip)
    - This flow requires [node-red-contrib-zip](https://flows.nodered.org/node/node-red-contrib-zip) and [node-red-node-watson](https://flows.nodered.org/node/node-red-node-watson)
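
If you run Node-RED locally, both extra nodes can usually be installed from your Node-RED user directory (typically `~/.node-red`) with `npm install node-red-contrib-zip node-red-node-watson`, or from the editor's *Manage palette* menu; restart Node-RED afterwards so the new nodes appear in the palette. On the IBM Cloud Node-RED Starter Kit you can likewise add them through *Manage palette* or by adding them to the application's `package.json`.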

    ## Deploy on IBM Cloud Node-RED Starter Kit or Node-RED local

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

## Testing your Watson Visual Recognition Custom Classifier with the Node-RED web app

    - Use the Node-RED web form at **/visualrecognition** to upload or link to test images.

    ## Testing your Watson Visual Recognition Custom Classifier model

    - Open your [Watson Visual Recognition instance](https://cloud.ibm.com/resources?search=vision)
    - Click on **Create a Custom Model**
    - Scroll down to the **Custom Models** section.
    - Click on **Test** to upload test images and validate your trained model
    ![Watson Visual Recognition Service](WatsonVisualReco-ServiceInstance.png?raw=true "Watson Visual Recognition Service Instance")
    - Scroll down to the **Custom Models** section and click on **Test** to open Watson Studio
    ![Watson Visual Recognition Custom Model](WatsonVisualReco-CustomModel.png?raw=true "Watson Visual Recognition Custom Model")
    - Click on the **Test** tab
    ![Watson Visual Recognition Custom Model Overview](WatsonVisualReco-CustomModelOverview.png?raw=true "Watson Visual Recognition Custom Model Overview")
    - Upload test images to validate your trained model
    ![Watson Visual Recognition Custom Model Test](WatsonVisualReco-CustomModelTest.png?raw=true "Watson Visual Recognition Custom Test")
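
When you test through the Node-RED web form instead, the flow's **Custom Classifier** function node decides which models each test image is sent to. A trimmed sketch of that node's logic (the full version is in `flow.json` below) looks like this:

```javascript
// Node-RED function node: runs before the Visual Recognition classify node.
// If a custom classifier ID has been stored in flow context, query it
// alongside the built-in "default" classifier; otherwise use "default" only.
var customClassifier = flow.get("CustomClassifier") || "";
msg.params = {};
if (customClassifier.length) {
    msg.params.classifier_ids = customClassifier + ",default";
} else {
    msg.params.classifier_ids = "default";
}
return msg;
```

The classification results are then rendered into the HTML table you see in the browser by the flow's *Process Results* function nodes.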
Binary file added WatsonVisualReco-CustomModel.png
Binary file added WatsonVisualReco-CustomModelOverview.png
Binary file added WatsonVisualReco-CustomModelTest.png
Binary file added WatsonVisualReco-ServiceInstance.png
Binary file added WatsonVisualReco-SimpleWebApp.png
Binary file added WatsonVisualReco-flow-screenshot.png
  18. johnwalicki created this gist Jun 13, 2019.
    33 changes: 33 additions & 0 deletions README.md
    Original file line number Diff line number Diff line change
    @@ -0,0 +1,33 @@
    ## Overview

    This flow builds a very simple web page / form that prompts the user to create a Watson Visual Recognition Custom Classifier. The web form requires a name for the custom classifier, prompts the user to upload a training set of >10 images of an object and >10 images of a negative training set.

    The flow then uploads the images, creates two zip files and then calls the [Watson Visual Recognition Custom Classifier](https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier) API.
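
Concretely, the flow's *Prepare to Create a Classifier* function node assembles the parameters that the `node-red-node-watson` Visual Recognition utility node passes to that API. A simplified sketch of that step (the property names below mirror what the flow's upload form and zip nodes produce):

```javascript
// Node-RED function node: runs after both zip buffers have been built.
//   msg.filename               - classifier name typed into the web form
//   msg.PositiveExamplesZipped - Buffer with the zipped positive example images
//   msg.payload                - Buffer with the zipped negative example images
var classnamePos = msg.filename + "_positive_examples";
msg.params = {};
msg.params.name = msg.filename;                        // used as the classifier_id prefix
msg.params[classnamePos] = msg.PositiveExamplesZipped; // required: 10+ positive images
msg.params.negative_examples = msg.payload;            // optional: 10+ negative images
return msg; // the createClassifier utility node then makes the API call
```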

To test the Visual Recognition model, the form also optionally prompts for an image URL to be analyzed.

To test the Visual Recognition model, the form also optionally prompts for an image to upload to be analyzed.

    ![Watson Visual Recognition Web Form Flow](https://github.com/johnwalicki/test.png?raw=true "Watson Visual Recognition Custom Flow")

## Prerequisites

    - Register for a free [IBM Cloud Account](http://cloud.ibm.com/registration)
    - Create a Watson Visual Recognition service instance
    - Log into [IBM Cloud](http://cloud.ibm.com)
    - Create a [Watson Visual Recognition service](https://cloud.ibm.com/catalog/services/visual-recognition)
- Return to the [IBM Cloud Resources Dashboard](https://cloud.ibm.com/resources)
    - Click on your Watson Visual Recognition instance
    - Copy the Watson Visual Recognition API key to your clipboard
    - This flow requires [**node-red-contrib-zip**](https://flows.nodered.org/node/node-red-contrib-zip)

    ## Deploy on IBM Cloud Node-RED Starter Kit or Node-RED local

    This flow will run in the IBM Cloud Node-RED Starter Kit or on a local instance of Node-RED. You will need to either bind the Watson Visual Recognition service to your IBM Cloud application or paste the Watson Visual Recognition API key into the Watson Visual Recognition nodes in the flow.

    ## Testing your Watson Visual Recognition Custom Classifier model

    - Open your [Watson Visual Recognition instance](https://cloud.ibm.com/resources?search=vision)
    - Click on **Create a Custom Model**
    - Scroll down to the **Custom Models** section.
    - Click on **Test** to upload test images and validate your trained model
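
If you have already trained a classifier in an earlier session, you do not have to retrain it: the flow includes a *Store a PreBuilt Custom Classifier ID* inject node whose payload (the placeholder `YourCustomClassifier_1724727066`) can be replaced with your own `classifier_id`. Clicking that inject button simply stores the ID in flow context, equivalent to:

```javascript
// Remember an existing classifier_id so the test form queries it
// together with Watson's default classifier.
flow.set("CustomClassifier", msg.payload);
return msg;
```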
    635 changes: 635 additions & 0 deletions flow.json
    Original file line number Diff line number Diff line change
    @@ -0,0 +1,635 @@
    [
    {
    "id": "7eeff30a.6e3d1c",
    "type": "tab",
    "label": "Watson Visual Recognition",
    "disabled": false,
    "info": ""
    },
    {
    "id": "2dd9981d.e20cb8",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Extract image URL",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "payload.imageurl",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 610,
    "y": 100,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "571eeb0b.a8b4a4",
    "type": "switch",
    "z": "7eeff30a.6e3d1c",
    "name": "Check image url",
    "property": "payload.imageurl",
    "propertyType": "msg",
    "rules": [
    {
    "t": "null"
    },
    {
    "t": "else"
    }
    ],
    "checkall": "true",
    "outputs": 2,
    "x": 360,
    "y": 60,
    "wires": [
    [
    "28547df4.9ce35a"
    ],
    [
    "2dd9981d.e20cb8"
    ]
    ]
    },
    {
    "id": "1c452e89.35c2d1",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/visualrecognition",
    "method": "get",
    "upload": false,
    "swaggerDoc": "",
    "x": 140,
    "y": 60,
    "wires": [
    [
    "571eeb0b.a8b4a4"
    ]
    ]
    },
    {
    "id": "28547df4.9ce35a",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "Simpe Web Page",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "<h1>Welcome to a Watson Visual Recognition sample image app</h1>\n<hr>\n<h2>Create a Watson Visual Recognition Custom Classifier</h2>\n<p>Upload 10 images and train a Watson Visual Recognition Custom Classifier</p>\n\n<form action=\"/upload2zip_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <br>Step 1: Submit a name for this Custom Classifier:<br>\n <input type=\"text\" name=\"ClassifierName\"/>\n <br><br>Step 2: Select (10 or more) POSITIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Positive\" multiple/>\n <br><br>Step 3: Select (10 or more) NEGATIVE .png/.jpg files to be uploaded and zipped:<br>\n <input type=\"file\" name=\"Negative\" multiple/>\n <br><br>Step 4: Train a custom classifier<br>\n <input type=\"submit\" value=\"Zip and Train\">\n</form>\n<hr>\n<h2>Test Watson Visual Recognition</h2>\n<p>Copy/Paste a URL to any image on the Internet to be classified:</p>\n<form action=\"{{req._parsedUrl.pathname}}\">\n <br/>Paste the URL in the box below.<br/>\n <br>Image URL: <input type=\"text\" name=\"imageurl\"/>\n <input type=\"submit\" value=\"Analyze Image URL\"/>\n</form>\n<hr>\n<p>Upload a file to be classified:</p>\n\n<form action=\"/uploadsimple_post\" method=\"POST\" enctype=\"multipart/form-data\">\n <input type=\"file\" name=\"myFile\"/>\n <input type=\"submit\" value=\"Analyze File\">\n</form>\n<hr>",
    "x": 810,
    "y": 60,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "56ef5dbe.d9afbc",
    "type": "http response",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "statusCode": "",
    "headers": {},
    "x": 1070,
    "y": 340,
    "wires": []
    },
    {
    "id": "33779adb.e3084e",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "Print msg.result.images",
    "active": true,
    "console": "false",
    "complete": "result.images",
    "x": 630,
    "y": 400,
    "wires": []
    },
    {
    "id": "3aaa0f22.a317d",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Step #1 - Create a Visual Recognition Service",
    "info": "1. Log into your Bluemix account\n2. Navigate to the Bluemix Catalog\n3. Scroll to the Watson Services section\n4. Find and click on the Visual Recognition service\n5. Create an unbounded Visual Recognition instance\n6. Open the new service and navigate to the Service Credentials\n7. Copy the api_key to the clipboard\n8. Open the above \"visual recognition v3\" node and paste your new API Key",
    "x": 260,
    "y": 420,
    "wires": []
    },
    {
    "id": "f321cd95.dfd7a8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - Multiple Classifiers",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day)\n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar c_id = 0;\nvar WhichClassifier = [];\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n var bestcolor = -1;\n var colorscore = 0;\n var item = \"\";\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n } \n } \n }\n\n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n if( typeof msg.result.images[0].classifiers[c_id].classes[i] != 'undefined') {\n if( typeof msg.result.images[0].classifiers[c_id].classes[i].class != 'undefined') {\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n // bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n } \n }\n\n if( bestcolor != \"-1\") {\n // found a color\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestcolor].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n }\n bestcolor = -1;\n } else {\n if( msg.result.images[0].classifiers[c_id].classes.length > 0) {\n if( typeof msg.result.images[0].classifiers[c_id].classes[bestItem].class != 'undefined') {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n } \n } \n }\n \n WhichClassifier.push(\"Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".<br>\");\n}\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nif( typeof(msg.result.images[0].resolved_url) != 'undefined' ) {\n msg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\n} else {\n msg.template = \"<p>Analyzed image: \"+ msg.mypic;\n}\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 
10%;}\";\nmsg.template=msg.template+\"</style>\";\n\n// 1st Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[0]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\n\n// More than one classifier?\nif( msg.result.images[0].classifiers.length == 1 ) {\n msg.payload=msg.template;\n return msg;\n}\n\n// Next Classifier\npicInfo = msg.result.images[0].classifiers[1].classes;\narrayLength = picInfo.length;\n\n// 2nd Table\nmsg.template=msg.template+\"<h2>\"+WhichClassifier[1]+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor ( i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload=msg.template;\nreturn msg;\n",
    "outputs": 1,
    "noerr": 0,
    "x": 670,
    "y": 360,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "55464a02.d2b9f4",
    "type": "visual-recognition-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway.watsonplatform.net/visual-recognition/api",
    "image-feature": "classifyImage",
    "lang": "en",
    "x": 290,
    "y": 380,
    "wires": [
    [
    "33779adb.e3084e",
    "f321cd95.dfd7a8"
    ]
    ]
    },
    {
    "id": "200da38a.be96bc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Process Results - One Classifier",
    "func": "if (typeof msg.result == 'undefined') {\n return null;\n}\n\nif (typeof msg.result.error != 'undefined') {\n //The Lite Plan allows users to make 7,500 API calls for free\n // Daily limit is (up to 250 calls per day) \n // {\"status\":\"ERROR\",\"statusInfo\":\"Key is over transaction limit\"}\n msg.template = msg.result.error.message;\n return msg;\n}\n\n// Text Extraction\nif (typeof msg.result.images[0].text != 'undefined') {\n var image_text = msg.result.images[0].text;\n msg.payload = image_text;\n msg.template = image_text;\n if( image_text.length >0 ) {\n msg.template= \"Watson found the words: \"+image_text;\n }\n return msg;\n}\n\nvar bestcolor = -1;\nvar colorscore = 0;\nvar c_id = 0;\nvar say = \"\";\nvar item;\n\nfor ( c_id=0; c_id < (msg.result.images[0].classifiers.length); c_id++ ){\n // find the best color, if any\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > colorscore){\n bestcolor = i;\n colorscore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n var bestItem = 0;\n var itemScore = 0;\n for( i =0; i<(msg.result.images[0].classifiers[c_id].classes.length); i++ ){\n var object = msg.result.images[0].classifiers[c_id].classes[i].class;\n if ( !object.includes(\"color\") ) {\n if( msg.result.images[0].classifiers[c_id].classes[i].score > itemScore){\n// bestItem = i;\n bestItem = 0;\n itemScore = msg.result.images[0].classifiers[c_id].classes[i].score;\n }\n }\n }\n \n if( bestcolor != \"-1\") {\n // found a color\n item = msg.result.images[0].classifiers[c_id].classes[bestcolor].class + \" \" + msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n bestcolor = -1;\n } else {\n item = msg.result.images[0].classifiers[c_id].classes[bestItem].class;\n }\n// say = say + \" Watson's \" + msg.result.images[0].classifiers[c_id].name + \" classifier thinks this picture contains a \" + item +\".\";\n say = say + \" Watson thinks this picture contains a \" + item +\".\";\n}\nmsg.payload = say;\n\nvar picInfo = msg.result.images[0].classifiers[0].classes;\nvar arrayLength = picInfo.length;\nmsg.template=\"<p>Analyzed image: \"+ msg.result.images[0].resolved_url+\"<br/><img src=\"+msg.result.images[0].resolved_url+\" height=\\\"200\\\"/></p>\";\nmsg.template=msg.template+\"<style>\";\nmsg.template=msg.template+\"table { width: 440px; margin-top: 10px; }\";\nmsg.template=msg.template+\"tr:nth-child(even){background-color: #f2f2f2;}\";\nmsg.template=msg.template+\"th, td { padding: 8px; text-align: left; border-bottom: 1px solid #ddd; width: 10%;}\";\nmsg.template=msg.template+\"</style>\";\n\nmsg.template=msg.template+\"<h2>\"+say+\"</h2><table span=100%><tr><th>Class</th><th>Confidence</th></tr>\";\nfor (var i = 0; i < arrayLength; i++) {\n msg.template = msg.template + \"<tr><td>\" + picInfo[i].class + \"</td><td>\" + picInfo[i].score + \"</td></tr>\";\n}\nmsg.template = msg.template + \"</table>\";\nmsg.payload = msg.template;\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 680,
    "y": 320,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "35370570.d632ea",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 350,
    "y": 180,
    "wires": []
    },
    {
    "id": "8a388039.2d1e",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Simple file upload example",
    "info": "http://localhost:1880/upload",
    "x": 130,
    "y": 180,
    "wires": []
    },
    {
    "id": "58949c54.02b47c",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/uploadsimple_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 130,
    "y": 220,
    "wires": [
    [
    "35370570.d632ea",
    "f1fe18f1.271458"
    ]
    ]
    },
    {
    "id": "f1fe18f1.271458",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "req.files[0].buffer",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 370,
    "y": 220,
    "wires": [
    [
    "8f488946.2633d8"
    ]
    ]
    },
    {
    "id": "8f488946.2633d8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Save Picture Buffer",
    "func": "if (msg.req.files[0].mimetype.includes('image')) {\n msg.mypic = `<img src=\"data:image/gif;base64,${msg.payload.toString('base64')}\">`;\n} else {\n msg.payload = msg.payload.toString();\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 610,
    "y": 220,
    "wires": [
    [
    "b56287e.3114ef8"
    ]
    ]
    },
    {
    "id": "6d4954fa.ac16cc",
    "type": "comment",
    "z": "7eeff30a.6e3d1c",
    "name": "Multiple file upload",
    "info": "",
    "x": 150,
    "y": 480,
    "wires": []
    },
    {
    "id": "85333459.a13e2",
    "type": "http in",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "url": "/upload2zip_post",
    "method": "post",
    "upload": true,
    "swaggerDoc": "",
    "x": 160,
    "y": 520,
    "wires": [
    [
    "3faf2672.a8acf2",
    "d7cf5668.86f84"
    ]
    ]
    },
    {
    "id": "3faf2672.a8acf2",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "complete": "req.files",
    "x": 390,
    "y": 480,
    "wires": []
    },
    {
    "id": "d7cf5668.86f84",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Construct Zip File attributes",
    "func": "// Confirm that all the files are images\nvar NumImages = msg.req.files.length ;\nvar AllImages = true;\n\n// Watson Visual Recognition requires a minimum of 10 images\n// to train a custom classifier\nif( NumImages < 2 ) {\n msg.payload = \"Watson Visual Recognition requires a minimum of 10 images to train a custom classifier\";\n return [msg, null] ;\n}\n\nfor( var i = 0; i < NumImages ; i++ ) {\n if ( !msg.req.files[i].mimetype.includes('image')) {\n // At least one file is not an image, throw an error\n AllImages = false ;\n }\n}\nif( !AllImages ) {\n msg.payload = \"Error Not all files are .png / .jpg image files\";\n return [msg, null] ;\n}\n\n// Step 1:\n// Install the node-red-contrib-zip Node-RED node\n//\n// Step 2:\n// Construct a msg.payload of an Array of files to be compressed into a ZIP object.\n// The ZipFile name is specified with msg.filename\n// Array: An array of objects containing 'filename' as a String and 'payload' as a Buffer/String\n// each representing one file in the resultiing zip\n\nvar PosZipArray = [];\nvar NegZipArray = [];\nfor( i = 0; i < NumImages ; i++ ) {\n if( msg.req.files[i].fieldname == \"Positive\") {\n PosZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n } else if ( msg.req.files[i].fieldname == \"Negative\") {\n NegZipArray.push( { \"filename\":msg.req.files[i].originalname, \"payload\":msg.req.files[i].buffer }) ;\n }\n}\nmsg.filename = msg.payload.ClassifierName;\n// Zip the Positive Example files first\nmsg.payload = PosZipArray ;\n// Store the Negative Examples for a second zip\nmsg.NegativeExamples = NegZipArray ;\n\nreturn [null,msg];",
    "outputs": 2,
    "noerr": 0,
    "x": 440,
    "y": 520,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ],
    [
    "11a95717.109f19",
    "94f3c37.b02ff4",
    "7c795730.6d26d"
    ]
    ]
    },
    {
    "id": "7c795730.6d26d",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Positive Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 180,
    "y": 620,
    "wires": [
    [
    "61d68c04.1ead5c"
    ]
    ]
    },
    {
    "id": "11a95717.109f19",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Success",
    "rules": [
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "Zip file created! Watson Visual Recognition is Training a custom classifier",
    "tot": "str"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 740,
    "y": 500,
    "wires": [
    [
    "56ef5dbe.d9afbc"
    ]
    ]
    },
    {
    "id": "94f3c37.b02ff4",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "true",
    "targetType": "full",
    "x": 730,
    "y": 540,
    "wires": []
    },
    {
    "id": "972816d2.00f088",
    "type": "visual-recognition-util-v3",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "vr-service-endpoint": "https://gateway-a.watsonplatform.net/visual-recognition/api",
    "image-feature": "createClassifier",
    "x": 500,
    "y": 700,
    "wires": [
    [
    "f6c7cbd7.78b798",
    "ba09e21c.8ead08"
    ]
    ]
    },
    {
    "id": "47169bc6.ad95dc",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Prepare to Create a Classifier",
    "func": "// Create a Classifier\n// Provide the following input :\n// msg.params[\"name\"] : a string name that will be used as prefix for the returned classifier_id (Required)\n// msg.params[\"{classname}_positive_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images. (Required)\n// msg.params[\"negative_examples\"] : a Node.js binary Buffer of the ZIP that contains a minimum of 10 images.(Optional)\n//\n// More information on this API documentation.\n// https://cloud.ibm.com/apidocs/visual-recognition#create-a-classifier\n\nvar classnamepos = msg.filename+\"_positive_examples\";\nmsg.params = {} ;\nmsg.params.name = msg.filename ;\nmsg.params.negative_examples = msg.payload\nmsg.params[classnamepos] = msg.PositiveExamplesZipped // zip file!\n\n// don't bother sending a big zip file to the Watson Visual Recognition Util node\n//msg.payload = \"\"; \n\nreturn msg ;",
    "outputs": 1,
    "noerr": 0,
    "x": 190,
    "y": 700,
    "wires": [
    [
    "972816d2.00f088",
    "eb571d37.5e4fd"
    ]
    ]
    },
    {
    "id": "f6c7cbd7.78b798",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "result",
    "targetType": "msg",
    "x": 730,
    "y": 760,
    "wires": []
    },
    {
    "id": "61d68c04.1ead5c",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip 2nd Set of Examples",
    "rules": [
    {
    "t": "set",
    "p": "PositiveExamplesZipped",
    "pt": "msg",
    "to": "payload",
    "tot": "msg"
    },
    {
    "t": "set",
    "p": "payload",
    "pt": "msg",
    "to": "NegativeExamples",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 430,
    "y": 620,
    "wires": [
    [
    "d8f6de11.7f6c9"
    ]
    ]
    },
    {
    "id": "d8f6de11.7f6c9",
    "type": "zip",
    "z": "7eeff30a.6e3d1c",
    "name": "Zip Negative Examples",
    "mode": "compress",
    "filename": "",
    "outasstring": false,
    "x": 690,
    "y": 620,
    "wires": [
    [
    "47169bc6.ad95dc"
    ]
    ]
    },
    {
    "id": "eb571d37.5e4fd",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 470,
    "y": 760,
    "wires": []
    },
    {
    "id": "ba09e21c.8ead08",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "result.classifier_id",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 770,
    "y": 700,
    "wires": [
    [
    "e3d0da66.c3b4b"
    ]
    ]
    },
    {
    "id": "b56287e.3114ef8",
    "type": "function",
    "z": "7eeff30a.6e3d1c",
    "name": "Custom Classifier",
    "func": "var CustomClassifier = flow.get(\"CustomClassifier\") || \"\";\nmsg.params = {};\n\n// Check if a Custom Classifier has been trained\nif( CustomClassifier.length ) {\n msg.params.classifier_ids = CustomClassifier + \",default\" ;\n} else {\n msg.params.classifier_ids = \"default\" ;\n}\n\nreturn msg;",
    "outputs": 1,
    "noerr": 0,
    "x": 840,
    "y": 160,
    "wires": [
    [
    "55464a02.d2b9f4",
    "771ce36c.58c1fc"
    ]
    ]
    },
    {
    "id": "771ce36c.58c1fc",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "params",
    "targetType": "msg",
    "x": 1070,
    "y": 160,
    "wires": []
    },
    {
    "id": "6f46531e.748754",
    "type": "inject",
    "z": "7eeff30a.6e3d1c",
    "name": "Store a PreBuilt Custom Classifier ID",
    "topic": "",
    "payload": "YourCustomClassifier_1724727066",
    "payloadType": "str",
    "repeat": "",
    "crontab": "",
    "once": false,
    "onceDelay": 0.1,
    "x": 210,
    "y": 820,
    "wires": [
    [
    "fe081768.a87008"
    ]
    ]
    },
    {
    "id": "fe081768.a87008",
    "type": "change",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "rules": [
    {
    "t": "set",
    "p": "CustomClassifier",
    "pt": "flow",
    "to": "payload",
    "tot": "msg"
    }
    ],
    "action": "",
    "property": "",
    "from": "",
    "to": "",
    "reg": false,
    "x": 510,
    "y": 820,
    "wires": [
    []
    ]
    },
    {
    "id": "e3d0da66.c3b4b",
    "type": "template",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "field": "payload",
    "fieldType": "msg",
    "format": "handlebars",
    "syntax": "mustache",
    "template": "Please wait for the {{result.classifier_id}} to complete training.",
    "output": "str",
    "x": 980,
    "y": 700,
    "wires": [
    [
    "caa94b72.4bd62"
    ]
    ]
    },
    {
    "id": "caa94b72.4bd62",
    "type": "debug",
    "z": "7eeff30a.6e3d1c",
    "name": "",
    "active": true,
    "tosidebar": true,
    "console": false,
    "tostatus": false,
    "complete": "false",
    "x": 1150,
    "y": 700,
    "wires": []
    }
    ]