Since graduating, I have begun graduate studies (still at Georgia Tech) and started working at the Georgia Tech Research Institute (GTRI) as a GRA. Even though these courses and work have already begun to consume my life, I still plan on updating this blog with all the interesting things I’m learning about, in and away from the classroom. There’s some new stuff available on here, such as a dedicated resume page and an updated About section. I also finally bought the domain name (no more “.wordpress.com”).
To kick things off, I’d like to learn how to make my own PCBs. Yes, it’s true that I don’t know how: even though I got a degree in electrical engineering, my coursework was largely on the theoretical side, with more computer-science-y work. I don’t regret it, but I do regret not spending more time “doing”. It’s been tough reconciling that with a new interest in hobby electronics and hardware engineering. I found this: https://www.sparkfun.com/tutorials/108, and I think it will be a great starting point. I’m excited at the prospect of making more professional projects, especially when so many of mine have never gone beyond the breadboard/perfboard.
Not today though; this pipeline-with-cache simulator won’t write itself.
I know it doesn’t really.
A few weeks ago I got myself this little beauty
It’s a beginner FPGA development board, which I got as a means to learn about FPGAs and VHDL/Verilog, something my coursework lacked despite my growing interest in embedded systems. Since graduation I’ve finally had some downtime to play around with it. For anyone wanting to join me, I’ve been following this site: http://hamsterworks.co.nz/mediawiki/index.php/FPGA_course, and it’s fantastic. The modules are easy to follow; I’ve flown through them and I’m having a blast.
I’ve implemented a simple state machine. This state machine will detect the switches being thrown in the order: 8, 7, 6, 5, where 8 is the leftmost switch. When this happens, the four leftmost LEDs will light. The four rightmost LEDs simply reflect the current state, and it won’t mean much to the viewer without seeing the implementation details.
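The VHDL itself isn’t shown here, but the state logic can be sketched in plain C++ (a hypothetical model of the detector, not the actual implementation):

```cpp
#include <cassert>

// Hypothetical sketch of the detector's state logic (the real design
// is in VHDL). 'sw' is the switch that was just thrown (8 down to 5).
// The machine advances only on the sequence 8, 7, 6, 5; anything else
// resets it, though throwing 8 may begin a new attempt.
struct SequenceDetector {
    int state = 0;  // 0 = nothing matched yet, 4 = full sequence seen

    void throwSwitch(int sw) {
        static const int expected[4] = {8, 7, 6, 5};
        if (state < 4 && sw == expected[state])
            ++state;                    // correct next switch: advance
        else
            state = (sw == 8) ? 1 : 0;  // wrong switch: restart
    }

    bool detected() const { return state == 4; }
};
```

The four rightmost LEDs on the board would show `state`; the four leftmost would light when `detected()` is true.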
While cleaning up the semester’s mess, I came across this pamphlet:
I have no idea where it came from, but I looked through it, and found some interesting circuits, such as this one:
It’s a “random number generator”. There are two main components: an oscillator circuit and a decade counter, pretty much split down the middle. About each one…
This involves the two NAND gates on the left. If you’d like, you can replace them with inverters, because tying the inputs of a NAND gate together produces the same function as a NOT (check: 1 NAND 1 = 0, 0 NAND 0 = 1).
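That identity is just the truth table above; a quick sanity check in code:

```cpp
#include <cassert>

// A NAND gate with its two inputs tied together behaves as a NOT gate.
bool nandGate(bool a, bool b) { return !(a && b); }

bool notViaNand(bool a) { return nandGate(a, a); }  // tie both inputs to 'a'
```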
Putting inverters together in a circuit with feedback creates an oscillator, like so:
And depending on the value of the resistor and capacitor, you can make it oscillate at different rates.
“Decade” means ten, so a decade counter counts through ten states. The specific IC used here (the 4017) has 10 output pins, one driven high for each count, instead of 4 pins that represent 0-9 in binary. The count is advanced by a clock, so the oscillator output connects to the decade counter’s clock pin, with a pushbutton separating the two circuits.
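A minimal model of that behavior (an illustrative sketch, not a simulation of the real 4017 silicon):

```cpp
#include <cassert>

// Minimal model of the 4017's outputs: on each clock pulse the count
// advances 0..9 and wraps, and exactly one of the ten output pins is
// driven high (a "one-hot" encoding, unlike a 4-bit binary counter).
struct DecadeCounter {
    int count = 0;

    void clockPulse() { count = (count + 1) % 10; }

    bool outputHigh(int pin) const { return pin == count; }  // pins 0..9
};
```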
To see which pin is being driven high, and hence the current count, you can put an LED in series with each output pin; the LED will light according to the count.
When the pushbutton is pressed, the decade counter sees the oscillator output and begins counting. If the oscillator is running fast enough, seemingly “random” numbers are produced by letting go of the pushbutton.
Because I’m on break and I just happen to have the components and ICs on hand, I built it.
I’ve been pretty good about not turning this into a personal blog, rather than a professional one, but here’s some pertinent personal news: as of Saturday, Dec. 14, 2013, yours truly is a graduate of the Georgia Institute of Technology with not one, but TWO degrees (Electrical Engineering/Math)! What does this mean? Unfortunately the excitement of the culmination of the last 4.5 years quickly washed away when I realized I’m still committed to graduate school. But this is still a major milestone that I’m proud to share. Woohoo!
They don’t give out the diplomas as you walk the stage though (things that could have been brought to my attentION YESTERDAY). I was handed a roll of paper, which I promptly unrolled as I returned to my seat.
I’ve since decided it’s rather appropriate.
Whoa, so very much thanks to Hack a Day for featuring my team’s senior design! The link is here: http://hackaday.com/2013/12/10/a-kinect-controlled-robotic-hand/
In this post I will detail what my group (Cameron Reid, Carlton Beatty, Chris Stubel, and myself, Ben Yeh) has been working on since September. Allow me to present my senior design:
Robotic Arm with Kinect Interface
To put it simply, it’s a robot arm that copies you. Think motion capture, but the results you see in real time on hardware.
Isn’t she a ‘beaut? Now if I had just said “robotic arm”, I think the typical reaction would be, “pffft, what’s so special about that?” Two things make this stand out.
The first is the end effector. We did away with the standard open-and-close claw and attempted to fashion something that looks and moves like the human hand. Admittedly, functionality-wise it’s not so different from the claw, in that the hand simply opens and closes, but it’s a step towards a dexterous hand in the future.
Second is how it’s controlled: via the Kinect and a custom sensor-glove. With this, the arm copies YOU in (almost) real time. Let me show you this thing in action:
As advertised, observe that the arm goes through roughly the same motion as I do.
I did not build this thing with any one specific application in mind. What I wanted was an intuitive interface for controlling a robot. This robot could then be sent into situations that humans can’t or shouldn’t go into. Imagine: if I could put some wheels on this bad boy to make it mobile, and I had another one for the other arm, this thing might be able to administer disaster relief, dispose of bombs, serve as a field medic, collect samples in deep-sea dives, fight fires, etc.
It’s also my first foray into the field of robotics, however cursory it was. I did not focus my coursework on robotics or controls at all. In addition, just about everything was built (read: hacked) from scratch.
If nothing else, I had a lot of fun building this thing, and I think it’s really freaking cool. Here’s a video of the group messing around:
HOW IT WORKS
At the highest level the system consists of: a Kinect, a PC, a microcontroller, and the rig (arm). Here’s a component diagram:
A user stands in front of the Kinect, which does some black magic to track the position of the user’s arm (just the arm, up to the wrist). This works in conjunction with the glove, which tracks the user’s hand. The glove hardware is simple enough to be processed by one uC (an Arduino Uno in this case), but the Kinect’s sensor information is processed by a program (written in Visual C++), then sent to another microcontroller (also an Arduino). The Arduinos then actuate the motors according to the data that was received.
Let’s go into more detail about how it works. This will be broken down into hardware (construction of the hand, arm, and glove) and software (using the Kinect, uC programming), and some misc.
Building the arm
Having very few resources, and no knowledge of commercial robot platforms (which would most likely be out of budget anyway), we set out to create our project from something very minimalistic. What we found was the AL5D by Lynxmotion (http://www.lynxmotion.com/c-130-al5d.aspx).
That doesn’t look very human-like, so we constructed a chassis and slapped the arm upside down on top of it; in the beginning stages it looked like the picture on the right. Let’s zoom in on each joint.
This is the shoulder. There are two degrees of freedom afforded here. The first I will call the “base”; it swivels about the vertical axis, controlled by one servo motor (hidden inside the case connected to the top). The second I will call the “shoulder” (terrible naming; I also collectively call these two movable parts the shoulder), which rotates about an axis perpendicular to the vertical axis, and is also controlled by another servo.
This is the elbow. It bends just like the human elbow does, and is controlled by another servo.
This is the wrist. It rotates about the axis made by your forearm (it moves like the Queen’s wave). And you guessed it: it’s controlled by another servo. You might wonder if it can bend, and the answer’s unfortunately no. Tracking the wrist and hand proved to be difficult, so we did away with this movement possibility.
Altogether, the arm (minus the hand) provides 4 DOF, so it’s not quite able to fully mimic a human arm.
Building the hand
The hardware for the arm was about $150, but this nifty “hand” will run you just around $25 and an afternoon. This is actually a pretty popular design for hobbyist animatronic hands. What you need is something with some natural tension/springiness. We found these things called toggle bolts; they look like this
They’re made to affix things to walls. The flap is spring-loaded, so that it can be closed, fed through a hole, and expand on the other side. We’re just interested in the flap part. If you overlay a few of these together and introduce some tubing, you can make yourself a “finger”
By tying a string to one end, and pulling on the other, it will bend in a way very similar to a real finger. Put a couple of these fingers together and you’ll almost have yourself a hand. Observe:
What separates us from the animals? Is it our thumbs? The thumb placement was nothing short of tricky. We eventually settled on placing it directly opposite the rest of the fingers, even though a real thumb is not this restrictive. It still ended up working well enough.
For reasons that will be explained later, we built a glove to control the hand portion of our arm.
Just like the hand, this is also a fairly common maker’s project. The yellow strips on the fingers are flex sensors, which are essentially potentiometers that react to bending. One way to use them is to make a voltage divider with each one and feed the output to an ADC, so as to monitor how you are bending your fingers. The red breakout board is an accelerometer, used to detect the tilt of the wrist. Depending on the orientation of your wrist, the force of gravity breaks down into varying components along the standard coordinate axes, like so
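Here’s a rough sketch of the voltage-divider math; the supply voltage, fixed resistance, and ADC resolution below are assumed illustrative values, not the actual parts we used:

```cpp
#include <cassert>
#include <cmath>

// Flex-sensor voltage divider: the sensor (rFlex) sits in series with a
// fixed resistor, and the ADC samples the midpoint. As the finger bends,
// rFlex rises, so the sampled voltage drops.
// Assumed values: 5 V supply, 10 k fixed resistor, 10-bit ADC.
double dividerVoltage(double vcc, double rFixed, double rFlex) {
    return vcc * rFixed / (rFixed + rFlex);  // voltage across the fixed resistor
}

int adcReading(double v, double vref = 5.0, int bits = 10) {
    return static_cast<int>(v / vref * ((1 << bits) - 1));  // 0..1023
}
```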
For now, this is all wired to the Arduino through a cable that connects to the perfboard circuit on the back of the hand.
Utilizing the Kinect
Even more so than the entire team’s lack of mechanical engineering and robotics knowledge, the whole project would have been a lot harder without the Kinect. I had played with the Kinect before, so I knew it was capable of skeletal tracking. By skeletal tracking, I mean that we can leverage the Kinect’s various sensors, through programming with the SDK provided by Microsoft, to produce a representation of the human body like this:
In addition to planar coordinates, the Kinect has a depth sensor, so it is actually able to provide coordinates of each joint IN SPACE. Moreover, all this requires is a few lines of code with the SDK. Then what we do is extend vectors between, say, the elbow and shoulder (call this A) and the elbow and wrist (call this B), and use the relationship A dot B = |A||B|cos(theta) to get the angle between them. To get other angles, it remains to find the appropriate reference angles.
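That dot-product step can be sketched like this (the `Vec3` struct here is a simplified stand-in for whatever joint type the Kinect SDK reports, not our actual code):

```cpp
#include <cassert>
#include <cmath>

// Elbow angle from three joint positions, using A . B = |A||B| cos(theta).
struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double mag(Vec3 a) { return std::sqrt(dot(a, a)); }

// Angle (degrees) between the elbow->shoulder and elbow->wrist vectors.
double jointAngle(Vec3 shoulder, Vec3 elbow, Vec3 wrist) {
    Vec3 A = sub(shoulder, elbow);
    Vec3 B = sub(wrist, elbow);
    double c = dot(A, B) / (mag(A) * mag(B));
    if (c > 1.0) c = 1.0;    // clamp against floating-point drift
    if (c < -1.0) c = -1.0;
    return std::acos(c) * 180.0 / std::acos(-1.0);
}
```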
Each time the program computes an angle, it sends it over serial to an Arduino.
Let’s consider Arduino #1 in the block diagram from the beginning. All it is responsible for is reading the various sensors on the glove and actuating the motors for the hand accordingly. For example, when I fully flex my index finger, a software mapping converts this reading into the angle that the motor connected to the index finger is supposed to turn to. The accelerometer is an I2C device; we similarly extract the acceleration magnitude along each coordinate axis and use some trig to get the wrist tilt angle.
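A sketch of those two mappings follows; the calibration endpoints and the 90-degree servo range are made-up illustrative numbers, not the project’s measured ones:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical flex-sensor-to-servo mapping. Assumed calibration:
// ADC reads 300 with the finger straight and 700 fully flexed.
int fingerServoAngle(int adc, int adcStraight = 300, int adcBent = 700) {
    if (adc < adcStraight) adc = adcStraight;  // clamp to the calibrated range
    if (adc > adcBent) adc = adcBent;
    // Linear map (like Arduino's map()): straight -> 0 deg, bent -> 90 deg
    return (adc - adcStraight) * 90 / (adcBent - adcStraight);
}

// Wrist tilt from gravity's components along two accelerometer axes.
double wristTiltDegrees(double ax, double az) {
    return std::atan2(ax, az) * 180.0 / std::acos(-1.0);
}
```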
The second Arduino doesn’t have a terribly complex job either. Its job is to move the 4 motors controlling the arm. It is connected directly to the computer running the Kinect skeletal-tracking program. It waits for the bytes that contain the angles for the arm, then moves the motors accordingly.
Accessories: a lot of the immediate applications for this project involve sending it into situations not suitable for humans, such as disaster relief and battlefields. We thought it would be neat if this thing could take a pulse. It turns out there is a product for this (http://pulsesensor.myshopify.com/pages/code-and-guide): a light sensor that, with some programming, can be used to detect a pulse. We strapped one onto the end of the index finger on the mechanical hand and attempted to have a user direct the arm to a pulse location.
It…could use some work. But it opens up the possibility of incorporating sensors into the project so that it could interact with the environment more effectively.
Well, that’s it. I hope you liked the project. I certainly enjoyed making it and learned so much in the process. The entire thing will be documented more heavily as required by the Senior Design course, and we will include all the code, schematics, etc. needed to replicate the project. For now, I’ll just leave the code here:
…oh and what’s this!?
For the uninformed, the InVenture Prize (http://www.gpb.org/inventure-2013) is a competition hosted at my school, Georgia Tech, in which groups enter projects and compete for prizes such as cash and a patent drawn up free of charge. I had not originally planned to enter, but I was presented with this golden ticket that guarantees entry starting with the semifinals. How about that?
Thanks for reading!
Hello! I’m back with another, and final, update on my senior design project. This is probably as finished as the project will get, save for some cosmetic changes and parameter fixes. I’m very proud of what it can do and how it looks; check it out below:
For our demo, we will attempt to have the arm move/manipulate some simple objects. We will have a booth set up at the Georgia Tech Capstone Design Expo where we will demonstrate our project amongst all the other senior designs, and I am confident that we will stand out.
When this is all said and done, I will post a detailed write-up of the project and make the design and code available for whoever wants to continue or modify it.