Do not make the same mistake that I made. I purchased a solitary Spark Core and nearly missed the awesomeness of Spark. You must buy two (2) from Spark.io: Sparks need friends!
A few years ago, I spent many hours with the Arduino Uno but did not find it all that engaging. Out of the box, an Arduino Uno lacks networking; it is not a member of the "Internet of Things" unless you add an appropriate shield or integrate another component. At the time, the Ethernet shield was somewhat expensive and WiFi was fairly challenging. Fast forward to 2015 and the world has changed dramatically.
The Spark Core is inherently network-aware: in its default state, you must have a WiFi connection even to flash/program it (which does mean it is slower to flash than a USB-connected Arduino). More importantly, your code can expose variables and functions that can be invoked via a REST API call to the Spark Cloud. You can use curl to interact with the physical world... for a Java/JavaScript coder like myself, this is pretty incredible.
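To give a flavor of what that looks like, here is a minimal sketch of my own (not part of this project) that registers a variable and a function with the Spark Cloud. The names ("reading", "led", ledToggle) are illustrative, and the device ID and access token in the trailing curl comments are placeholders you would replace with your own.

#include "application.h"

int lastReading = 0;   // exposed read-only to the cloud
int ledPin = 7;

// exposed to the cloud; receives the "args" string from the REST call
int ledToggle(String command) {
    if (command == "on") {
        digitalWrite(ledPin, HIGH);
        return 1;
    }
    digitalWrite(ledPin, LOW);
    return 0;
}

void setup() {
    pinMode(ledPin, OUTPUT);
    Spark.variable("reading", &lastReading, INT);
    Spark.function("led", ledToggle);
}

void loop() {
    lastReading = analogRead(A0);
    delay(400);
}

// From any shell, roughly (DEVICE_ID and TOKEN are placeholders):
//   curl https://api.spark.io/v1/devices/DEVICE_ID/reading?access_token=TOKEN
//   curl https://api.spark.io/v1/devices/DEVICE_ID/led -d access_token=TOKEN -d args=on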
In addition, you can engage in publish/subscribe messaging between two or more Spark devices.
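The essence of that pattern is just two calls: Spark.publish on the sending Core and Spark.subscribe (plus a handler) on the receiving Core. Here is a stripped-down sketch, collapsed into one file for brevity, with an illustrative event name ("my-event"); the full publisher and subscriber sketches for this project follow below.

#include "application.h"

// In practice the publish lives on one Core and the subscribe/handler on another.
void myHandler(const char *event, const char *data) {
    // react to the event; data is the string passed to Spark.publish
}

void setup() {
    Spark.subscribe("my-event", myHandler, MY_DEVICES);
}

void loop() {
    Spark.publish("my-event", "HIGH", 60, PRIVATE); // name, data, TTL seconds, visibility
    delay(5000);
}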
My thinking is that an intelligent sensor (based on Spark Core) in one location can take readings and based on need (e.g. stock needs replenishing, conveyor motor offline, temperature is rising), can alert other intelligent actuators to take some action in the physical world.
In this case, I used a Force Sensitive Resistor, whose reported value rises with the amount of pressure (squeeze) you apply, and I used the super simple Spark.publish API to send the event out to the Internet. On the other Spark Core, I set up a subscriber that invokes a simple function to "animate" the servo+gripper, also from SparkFun. Both Spark Cores are sitting on their respective breadboards (the Core ships with a breadboard), and the subscriber Core (the gripper) is being powered by the Spark Battery Shield.
Publisher (with Force Sensitive Resistor) Code
#include "application.h" /* From the article: http://bildr.org/2012/11/force-sensitive-resistor-arduino https://www.dropbox.com/s/dodvjb814uxfscs/2015-03-04%2019.59.21.jpg?dl=0 https://www.dropbox.com/s/rduygfde2irjyv7/2015-03-04%2019.59.34.jpg?dl=0 */ int FSR_Pin = A0; //analog pin 0 int ledPin = 7; bool wasHigh = false; void setup(){ Serial.begin(9600); pinMode(ledPin, OUTPUT); } void loop(){ int fsrReading = analogRead(FSR_Pin); Serial.println(fsrReading); if (fsrReading > 3000) { Serial.println(wasHigh); if (!wasHigh) { Serial.println("sending HIGH"); wasHigh = true; digitalWrite(ledPin, HIGH); Spark.publish("burrsqueeze", "HIGH", 60, PRIVATE); } } else { if (wasHigh) { // the state is normally sub-3000 Serial.println("sending LOW"); digitalWrite(ledPin, LOW); Spark.publish("burrsqueeze", "LOW", 60, PRIVATE); wasHigh = false; } } delay(400); //just here to slow down the output for easier reading }
Subscriber (with Servo+Gripper)
#include "application.h" /* Receives a 'squeeze' HIGH event and closes the servo+gripper to close */ Servo myservo; const bool DEBUG=true; int minPos = 32; int pos = 0; int maxPos = 160; int ledPin = 7; int i = 0; void close_gripper() { if (DEBUG) Serial.println("close"); // close for(pos = maxPos; pos >= minPos; pos-=1) { myservo.write(pos); delay(25); } } void open_gripper() { if (DEBUG) Serial.println("close"); // open for(pos = minPos; pos < maxPos; pos += 1) { myservo.write(pos); delay(25); } } void eventHandler(const char *event, const char *data) { if (DEBUG) Serial.print(event); if (DEBUG) Serial.print(", data: "); if (data) { if (DEBUG) Serial.println(data); // assume LOW unless HIGH arrives if (String(data) == "HIGH") { digitalWrite(ledPin, HIGH); close_gripper(); } else { open_gripper(); digitalWrite(ledPin, LOW); } } else { Serial.println("NULL"); } } void setup() { if (DEBUG) Serial.begin(9600); Spark.subscribe("burrsqueeze", eventHandler, MY_DEVICES); pinMode(ledPin, OUTPUT); myservo.attach(A0); myservo.write(maxPos); } void loop() { delay(400); if (DEBUG) { i++; Serial.print("I'm Alive: "); Serial.println(i); } }Note: I use the Spark Dev (on desktop Atom-based Spark tool) so I include application.h