Monday, November 11, 2013

How NASA fixed a computer on another planet

I just love it when something that sounds like science fiction is actually true. In 1997 the Mars Pathfinder landed on Mars, but after a few days it started having trouble: the computer was "crashing" and resetting, preventing the Pathfinder from finishing its daily tasks, which it wouldn't resume until the next day.

If you are interested in the technical details, you may find them HERE, in the words of someone who was actually involved in the mission. If you are curious but want a quick and simple explanation, I'll try to provide it here the way I (as a noob) understood it.

Super quick overview of the system:
The Pathfinder was controlled by a Real Time Operating System (RTOS), software whose main characteristics include the ability to schedule tasks in order of priority and even "pause" a running task if something with higher priority comes along.

The CPU was connected to a Bus, a sort of "pathway" shared by different devices, one of them called ASI/MET, used for gathering meteorological data. Access to this Bus was controlled by the RTOS through two tasks with the highest priority in the entire system.

What went wrong:
One of the fun things about RTOSs is that sometimes a variable is used by two different tasks, and you have to make sure only one of them is changing it at a time. You can protect access to these variables using yet another variable called a "semaphore", which a task can "take" when it's going to access the variable and "release" when it's done with it. If a task needs access to a variable but the semaphore is "in the hands" of another task, the task is put on hold until it can take it.
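Here's a minimal sketch of the take/release idea using a VxWorks-style API (VxWorks being the RTOS the Pathfinder flew); the variable and semaphore names are mine, just for illustration, and this is definitely not the actual flight code:

    #include <vxWorks.h>
    #include <semLib.h>

    SEM_ID dataSem;      /* the semaphore guarding the shared variable */
    int    sharedData;   /* a variable used by more than one task      */

    void initData(void)
    {
        /* a plain mutex semaphore, no priority inheritance (yet) */
        dataSem = semMCreate(SEM_Q_PRIORITY);
    }

    void updateData(int value)
    {
        semTake(dataSem, WAIT_FOREVER);  /* "take" it before touching it */
        sharedData = value;              /* safe: we hold the semaphore  */
        semGive(dataSem);                /* "release" it when done       */
    }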

Now, imagine that a low priority task takes a semaphore but then gets "paused" before it can release it, because the RTOS needs to run a higher priority task. Then other higher priority tasks keep getting in the way, so the poor little task that has the semaphore never gets a chance to continue. Finally, imagine that a super important task has to run but it needs that semaphore... yeah, that's a problem. This situation even has a name: "priority inversion", because the highest priority task ends up effectively stuck behind the lowest.

In a nutshell, that's what was causing the problem with the Pathfinder. The low priority task that controlled the ASI/MET took (and didn't get the chance to return) a semaphore that was also needed by one of the high priority tasks that controlled the Bus, preventing it from completing. When the other task that controlled the Bus tried to run, it saw that the first one hadn't completed, figured something was terribly wrong, and declared the error that triggered the reset of the entire system.

How it was fixed:
Once NASA figured out what the problem was, and after carefully making sure the change wouldn't break something else, they enabled a feature in the RTOS called "Priority Inheritance". This feature pretty much allows the priority of a task to be "bumped up" while it's holding a semaphore that a higher priority task needs, so it can finish its business and release it. To apply the change they didn't transmit an entire new copy of the software to the Pathfinder; they just sent the pieces that needed to change and the system "patched" itself.
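If you're wondering what enabling that looks like, in VxWorks-style C it comes down to a single flag when the semaphore is created (again, just a sketch of the idea, not the real flight code):

    /* SEM_INVERSION_SAFE turns on priority inheritance for this mutex:
       a low priority task holding dataSem gets temporarily bumped up to
       the priority of the highest task waiting for it. */
    dataSem = semMCreate(SEM_Q_PRIORITY | SEM_INVERSION_SAFE);

One flag of difference, millions of kilometers away.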

And that was it... easy... right?

Thursday, October 31, 2013

Scratch that!

Yeah, I thought I was going to write more often, but then I got a job. That was 8 months ago... Last week, the possibility of a better job came along. Why is that relevant to this blog? Well, I'm still waiting to hear from my potential new employers, and while I was reflecting on my interviews I realized that my coding style left much (really, a lot) to be desired. I came to the conclusion that even if I don't get the position, at least I gained the desire to do things the right way, starting with the readability of my little projects.

So, whatever little code you may find here that was posted before today, please excuse the horrible style. I will try to do better from now on.

EDIT: One of the things that helped me prepare for my interview was an article called "Twenty-Five Most Common Mistakes with Real-Time Software Development" by David B. Stewart, PhD. I liked it a lot, and I feel that it applies to other types of development, not just embedded systems. You may find the article HERE. Just in case, I have also kept a copy HERE; I hope that doesn't bother anyone...

Sunday, May 12, 2013

A few more notes about the 7-segment display and IR receiver on Teensy

You may have noticed that I hooked up the Teensy board straight to the 7-segment display... That was probably not the best practice, for a couple of reasons: 1) it would be better to drive the display through transistors or a controller IC instead of actually powering it from the board, and 2) that's a lot of pins used. This was just an exercise to practice programming on the Teensy.

Just as a reminder, here's a picture of it; the remote uses the NEC protocol.



And before I forget, here's the list of decoded IR commands from my remote.

Vol-     FD00FF    Play/Pause  FD807F    Vol+       FD40BF
Setup    FD20DF    Up          FDA05F    Stop/Mode  FD609F
Left     FD10EF    Enter/Save  FD906F    Right      FD50AF
0/10+    FD30CF    Down        FDB04F    Back       FD708F
1        FD08F7    2           FD8877    3          FD48B7
4        FD28D7    5           FDA857    6          FD6897
7        FD18E7    8           FD9867    9          FD58A7
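In case it helps anyone, this is roughly how those codes come out of Ken Shirriff's IRremote library (a minimal sketch; the receiver pin is an assumption, use whatever pin you wired yours to):

    #include <IRremote.h>

    const int RECV_PIN = 11;   // assumption: IR receiver data wired to pin 11
    IRrecv irrecv(RECV_PIN);
    decode_results results;

    void setup() {
      Serial.begin(9600);
      irrecv.enableIRIn();     // start the receiver
    }

    void loop() {
      if (irrecv.decode(&results)) {
        // results.value holds the 32-bit NEC code, e.g. 0xFD00FF for Vol-
        Serial.println(results.value, HEX);
        irrecv.resume();       // get ready for the next code
      }
    }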



Anyway, if I ever complete my Space Invaders in Teensy and Spartan project, I won't be using this display; I will most likely use the display on the Spartan board.

Monday, February 25, 2013

IR Receiver on Teensy

For the past few days I've worked on the first building blocks of my Space Invaders project. As of right now, the approach I've decided to take is to have the Teensy board take the input from an infrared remote and run most of the game logic, then send some display information, like the positions of the different objects on the screen, to the Spartan-3 board (maybe via serial port) and have the Spartan-3 do all the display work through its VGA port.

I have been working on the VGA controller and video memory on the Spartan-3 (more on that later), but I took a break from it to work on the controller for the game. Basically, there is only a need for 3 buttons: left, right, and fire. Maybe I'll add extra functionality after the basic part is done and use some other buttons, but for now 3 is all I need.

This was my first time using the Arduino environment to program the Teensy (or anything, really); the Teensy is compatible with it as well as with plain C. I was glad to find the many libraries at my disposal. I started with Ken Shirriff's IRremote library and added my 7-segment display, with slightly different code than the one I used last time to make its timing more independent of the rest of the program. I also used a piece of an old Ethernet cable for the wiring, to make it nicer and avoid the mess from last time.


My IR remote uses the NEC protocol, and Shirriff's library worked like a charm; I only added a few lines of code to account for held-down keys, so that the code for the pressed key keeps repeating instead of changing to the special repeat code. The above picture is of the 7-segment display showing the last 2 bytes of the decoded result, which, in the case of my remote, are the only ones that change.
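Those few lines boil down to something like this (a sketch with my own names; showOnDisplay stands in for my 7-segment code). The library reports an NEC repeat as the constant REPEAT, so I just substitute the last real code whenever that shows up:

    unsigned long lastCode = 0;   // last non-repeat code received

    void handleRemote() {
      if (irrecv.decode(&results)) {
        if (results.value == REPEAT) {
          results.value = lastCode;       // key held down: reuse previous code
        } else {
          lastCode = results.value;       // new key press: remember it
        }
        showOnDisplay(results.value & 0xFFFF);  // last 2 bytes, as in the picture
        irrecv.resume();
      }
    }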

As I mentioned, this is just part of the Space Invaders project, so the code for this will be available on GitHub HERE. The 7-segment display files can be reused as a basic library; just keep in mind that the pin assignments are defined in the .h file.



Thursday, February 21, 2013

What's next?

Remember all that hardware I got last year? Well, I finally have some time to start playing with it! I still don't have any original project to work on, so in the meantime I will be doing an adaptation of one of my old school labs for practice, focusing especially on my coding style, because I am well aware of my deficiency in that area.

A couple of years ago, for my embedded systems class, we replicated the Space Invaders game on a Xilinx Virtex-II Pro board. Sadly, I didn't catch the finished product on video, and the only thing I could find was this video of me testing part of it:


So, for my practice project I think I'm going to do the same thing, but this time I'll use my Teensy++ and Spartan-3, which is by far less powerful than the Virtex-II, and adapt some of my old VHDL code as I build everything again from scratch. As of right now I haven't really considered the limitations of my current hardware now that I won't have a "fancy" PowerPC processor and 2GB of RAM, but the plan is to program a part of the game on each device and get them to talk to each other. This way I get to practice on both of them and also learn something in the process of making them interact. It should be doable... right? ...right?

Anyway, once I get started I'll post my code on GitHub for everyone to see; just keep in mind that it will be a long-term, always-work-in-progress thing, for learning and practice purposes only.


Pey! Where have you been?

Wow, it's been almost a year since my last post. I had to set aside my little projects because 2012 was my last year in college, and since I left the best classes for last, I had to focus on them. My last semester was especially interesting, as I did my senior project and learned a little about Android development, something that I hadn't even touched before. In a nutshell, what my team and I did was develop an Android app for a Galaxy Nexus to drive a little toy truck using its GPS, compass, and camera, receiving target locations over the internet from another Android app developed for a Nexus 7 tablet.

The tablet app had a map to select destination points and a couple of testing functions, as well as a D-pad for manual override. It looked like this:


We used OpenCV to make the camera pick out orange cones specifically, so the truck could avoid them. Here's a video of the last test we ran of the completed project (I'm the guy in the red jacket):


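Going back to the cone detection for a second: the general idea is to threshold each frame in HSV space for "orange" and then pull out the blobs. Here's a rough sketch of that using OpenCV's C++ API (our app was actually Android/Java, and the HSV range below is a guess; we tuned ours for our specific cones):

    #include <opencv2/opencv.hpp>
    #include <vector>

    // return the contours of orange-ish blobs (candidate cones) in a BGR frame
    std::vector<std::vector<cv::Point> > findCones(const cv::Mat& frameBGR) {
        cv::Mat hsv, mask;
        cv::cvtColor(frameBGR, hsv, cv::COLOR_BGR2HSV);        // BGR -> HSV
        cv::inRange(hsv, cv::Scalar(5, 100, 100),              // lower "orange"
                         cv::Scalar(20, 255, 255), mask);      // upper "orange"
        std::vector<std::vector<cv::Point> > contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                         cv::CHAIN_APPROX_SIMPLE);
        return contours;    // each contour is a candidate cone to steer around
    }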
I admit there was some room for improvement here and there, but given the time we had to do it, and the fact that we really didn't know anything about Android development or computer vision when we started, I think we did great. And with this I said goodbye to college, and I am officially an Electrical Engineer! Yay!