
Abstract 

In this project, we added vision capability to our robot in order to give it more functionality: the ability to recognize more objects within its space and respond to them appropriately. We did this by integrating the SVM (Social Visual Module) into our system. To send messages to and receive messages from the module, we used the IPC (Inter-Process Communication) system, which returned data based on the visual module's actual responses to the enabled operators. We then implemented several programs to turn on various operators and broadcast the data received from the SVM.
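The subscribe-and-dispatch pattern the IPC system provides can be illustrated with a minimal self-contained sketch. The function and message names below (subscribe, dispatch, "pink_badge") are illustrative only and are not the real IPC API; the real library handles this over a connection to central.

```c
#include <string.h>

/* Minimal sketch of the subscribe/dispatch pattern that IPC
   provides; names here are illustrative, not the real IPC API. */
typedef void (*MsgHandler)(const void *data);

#define MAX_SUBS 8
static struct { const char *msg; MsgHandler fn; } subs[MAX_SUBS];
static int nsubs = 0;

/* Register a handler to be run whenever a named message arrives. */
void subscribe(const char *msg, MsgHandler fn) {
    subs[nsubs].msg = msg;
    subs[nsubs].fn = fn;
    nsubs++;
}

/* Called when a message arrives from central: run every handler
   registered under that message name. */
void dispatch(const char *msg, const void *data) {
    for (int i = 0; i < nsubs; i++)
        if (strcmp(subs[i].msg, msg) == 0)
            subs[i].fn(data);
}

/* Example handler: count pink badge detection messages. */
static int pink_count = 0;
static void pink_handler(const void *data) {
    (void)data;
    pink_count++;
}
```

In the real system, the handler body would unpack the detection data instead of just counting messages.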

Task Description

Our first task was to run the listener program. The next was to implement a program that connects to central, commands the SVM to turn on the pink badge detector, and then prints out any detections. The third task was to connect to IPC when our navigation program started and call IPC_listenClear() in the main loop. Our last task was to add a new state to our system, PINK_TRACK, that turns on the pink badge detector and approaches detections to within a given distance.
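The "enter PINK_TRACK every 30 seconds" behavior can be sketched as a small timer check in the main loop. The state names other than PINK_TRACK, and the helper itself, are assumptions for illustration; the report does not give the actual state machine.

```c
#include <time.h>

/* Hypothetical state set; WALL_FOLLOW is illustrative, PINK_TRACK
   is the state named in the report. */
typedef enum { WALL_FOLLOW, PINK_TRACK } RobotState;

#define PINK_TRACK_PERIOD 30  /* seconds between forced switches */

/* Return the state the main loop should run next: switch to
   PINK_TRACK whenever PINK_TRACK_PERIOD seconds have elapsed since
   the last switch, regardless of the current state. */
RobotState next_state(RobotState current, time_t now, time_t *last_switch)
{
    if (now - *last_switch >= PINK_TRACK_PERIOD) {
        *last_switch = now;
        return PINK_TRACK;
    }
    return current;
}
```

In the real program this check would sit in the same loop that calls IPC_listenClear(), so message handling and the state switch share one iteration.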

Solution

The first task was completed by running the program in a terminal, with central and the other required programs running in their own terminals. The second task was implemented by removing a large section from the listener.c file and keeping only the pink badge detector. The third task was implemented inside our wall-following program. The final task was also implemented within the wall-following program: PINK_TRACK was entered every 30 seconds regardless of what state the robot was in. Once inside the PINK_TRACK state, we used the dataHandler function to store the incoming data in an Operator_data struct and used the values from that structure to move the robot forward, left, or right.
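The forward/left/right decision inside PINK_TRACK can be sketched as below. The Operator_data fields and the tolerance value are assumptions based on the report's description, not the actual struct layout returned by the SVM.

```c
/* Hypothetical detection data; the real Operator_data struct
   returned through the dataHandler may differ. */
typedef struct {
    int x;      /* horizontal pixel position of the detection */
    int width;  /* image width in pixels */
} Operator_data;

typedef enum { MOVE_FORWARD, TURN_LEFT, TURN_RIGHT } MoveCmd;

#define CENTER_TOLERANCE 20  /* pixels; this close to center counts as centered */

/* Steer toward the detection: turn until the badge is roughly
   centered in the image, then drive forward to approach it. */
MoveCmd pink_track_command(const Operator_data *d)
{
    int center = d->width / 2;
    if (d->x < center - CENTER_TOLERANCE) return TURN_LEFT;
    if (d->x > center + CENTER_TOLERANCE) return TURN_RIGHT;
    return MOVE_FORWARD;
}
```

A distance check against the given approach distance would wrap this call, stopping the robot once the detection is close enough.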

Conclusion

This was an important project because it gave the robot the ability to detect things within its environment with greater accuracy. In short, we made the robot smarter, allowing it to perform more meaningful actions within its programming.