Blog Posts

More Prototyping (Week 6)

As indicated in the previous post, this week featured the first class session exclusively dedicated to project work. This was a great opportunity for the Road Watch team to collaborate for an extended period of time. We met virtually, via Teams, to discuss project status and exchange ideas. It was a nice change of pace to be able to work on a segment of the project and immediately ask a teammate for input, instead of messaging them and waiting for a reply. Several developments came out of this meeting. First, we began planning a design for a physical LiDAR test bench, which would use extruded aluminum bars to create a track for a scaled-down car. Next, we discussed the status of the Detection Subsystem PCB, which has just been ordered and will hopefully arrive next week. Finally, several team members configured their local Git repositories so that they can track their code changes and commit to the team repository on GitHub. Overall, the meeting was very productive. QRB2 is rapidly approaching, so be sure to read next week’s post to follow our preparation efforts.

The stackup of the electronics for the Detection Subsystem. The Detection Subsystem PCB sits on top of the Jetson Nano Development Kit’s header pins, which enables the Jetson Nano to offload IO (input/output) interfacing to a microcontroller on the PCB. The Detection Subsystem PCB will hopefully arrive next week; once it does, we will recreate this picture in real life.

Prototyping Efforts (Week 5)

Based on feedback from QRB1, the Road Watch team decided that it is imperative to have a fully working prototype of all non-AI aspects of the project. This essentially means everything must be working in an integrated system, with the exception of the predictive algorithm and the computer vision system. Thus, substantial progress has been made this week toward this goal. For instance, Darrion and Skyler successfully implemented the 1D to 2D LiDAR conversion process, and have all mechanical, electrical, and software components working in tandem to provide a 2D scan of an area. Will also made significant progress on the Detection Subsystem PCB, and will order it later this week. Billy and Rolando have been busy sourcing many of the electrical components required for our system. Finally, Evan was able to extract vehicle trajectories from the NGSIM dataset, which will be necessary for training the predictive algorithm. Next week, we have been given the whole class period to do more project work, so be sure to check in next Friday to see how our collaborative prototyping efforts pan out.
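For readers curious about the trajectory extraction, here is a minimal sketch of the idea, assuming an NGSIM-style CSV with Vehicle_ID, Frame_ID, Local_X, and Local_Y columns (the column names are assumptions based on public NGSIM exports, and this is not Evan’s exact code):

```python
import pandas as pd

def extract_trajectories(source):
    """Return a dict mapping vehicle ID -> (N, 2) array of (x, y) points.

    Column names (Vehicle_ID, Frame_ID, Local_X, Local_Y) are assumed
    from the public NGSIM exports; an actual file may differ.
    """
    df = pd.read_csv(source)
    # Order each vehicle's samples chronologically before grouping
    df = df.sort_values(["Vehicle_ID", "Frame_ID"])
    return {
        vid: group[["Local_X", "Local_Y"]].to_numpy()
        for vid, group in df.groupby("Vehicle_ID")
    }
```

Each resulting array is one vehicle’s path over time, which is exactly the shape of data a trajectory-prediction model trains on.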

Some prototyping deliverables from the week. On the left is a vehicle trajectory, which was extracted from the NGSIM dataset. On the right is the fully integrated prototype 2D LiDAR system; the mechanical mount, electrical hardware, and software output are all shown.
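The core of the 1D to 2D conversion is pairing each range reading with the mirror angle at which it was taken, then converting from polar to Cartesian coordinates. A minimal sketch, with the angle step treated as an assumed parameter of the setup rather than our actual calibration:

```python
import math

def scan_to_points(distances_m, start_deg=0.0, step_deg=1.0):
    """Convert a sweep of 1D range readings into 2D (x, y) points.

    Each reading is paired with the mirror angle at which it was taken;
    the angle step per reading is an assumed property of the setup.
    """
    points = []
    for i, r in enumerate(distances_m):
        theta = math.radians(start_deg + i * step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Plotting the resulting points produces the 2D scan shown in the software output above.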

QRB1 (Week 4)

This week, the Road Watch team delivered its first Qualification Review (QRB1). As discussed in the previous blog post, the QRB events provide an opportunity for teams to receive constructive feedback and a risk assessment from a panel of experts. Our feedback was mostly positive (which was good to hear), though some issues were raised. First, the panel thought that some elements of our system (presumably our computer vision solution and our predictive algorithm) were ill-defined at this point, a concern that we also raised. Second, they expressed doubt in our ability to satisfy all design specifications with our prototype, probably because of the uncertain viability of our predictive algorithm. Further, the panel recommended that we focus on creating a minimum viable hardware solution before optimizing the predictive algorithm, and they emphasized that this should happen soon. Thus, we are ramping up our design work for these hardware deliverables. Be sure to come back next week to see the progress we’ve made.

Some of the design deliverables created for QRB1. On the left is a section view of the 3D mechanical CAD model of our enclosure. The various systems, including the power supply, PCB, LiDAR, and camera array, can be observed. On the right is a 3D view of the detection PCB design, which plugs into the 40-pin header on the Jetson Nano Developer Kit. These will be manufactured and assembled in the coming weeks.

Preparing for QRB1 (Week 3)

This week, the Road Watch team prepared for its first Qualification Review Board, or QRB1. At a QRB, our team presents our project to experienced faculty, who offer feedback on the project. The main objective is to pinpoint risks in the project and to devise mitigation efforts. In the context of product development, a risk is any anticipated or unanticipated factor that might hinder the development of the product. An example of a currently identified risk is that binocular vision’s increasing uncertainty with distance could render the computer vision approach to this project unviable (see the figure below for a more in-depth explanation). At the QRB, our team plans to give a quick overview of the project, after which each member will present their current progress and discuss associated risks. Check in next week to see what feedback we receive at QRB1.

In binocular vision, two cameras capture the same scene from slightly different perspectives. This allows the positions of objects in the scene to be calculated relative to the cameras. This is how the human visual system works, and it is what gives us depth perception. A drawback of this approach, however, is that the uncertainty in position calculation increases rapidly with distance. This is demonstrated in the figure above, which shows how a position in the scene gets mapped onto the X-axis of each camera’s image (note that the pixel sizes are greatly exaggerated to illustrate the principle). As the position gets further from the camera array (which is centered at (0m, 0m)), a larger surrounding area gets mapped to the same pixel, which increases the uncertainty of the measurement. This makes it very challenging, or perhaps even impossible, to measure the positions of distant objects (such as cars on a highway).
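The quadratic growth of this depth error can be sketched numerically. The standard stereo relation gives depth z = f·b/d for focal length f (pixels), baseline b, and disparity d, so a disparity error of Δd pixels maps to a depth error of roughly z²·Δd/(f·b). The baseline, focal length, and one-pixel error below are illustrative assumptions, not our camera array’s actual specifications:

```python
def depth_uncertainty(z_m, baseline_m, focal_px, disparity_err_px=1.0):
    """Approximate stereo depth error at range z.

    From z = f*b/d, a disparity error of dd pixels gives a depth error
    of about z**2 * dd / (f * b). Parameters here are illustrative.
    """
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)
```

With a 0.2 m baseline and a 1000 px focal length, a one-pixel disparity error corresponds to roughly 0.5 m of depth error at 10 m, but 12.5 m at 50 m, which illustrates why distant vehicles are so hard to localize with cameras alone.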

In-Class Work (Week 2)

This week, the Road Watch Team had its first in-class work session. These are a new feature of IPPD 2 that were not offered in IPPD 1, as IPPD 2 focuses primarily on the practical realization of the system (which necessitates collaborative in-person work). It was very nice to have all team members physically together in one place, all working on different parts of the system. This provided a good opportunity to quickly exchange ideas between teammates and to catch up on everyone’s progress. Director Latorre indicated that these in-class work sessions would be a bi-weekly occurrence, which is greatly appreciated by the Road Watch Team. Currently, work is distributed between teammates as follows. Darrion is developing the LiDAR driver for the Jetson Nano. Evan is working on identifying predictive machine learning algorithms. Will is evaluating the camera throughput on the Jetson Nano. Rolando is identifying an IMU to assist in aligning the system. Skyler is designing the system enclosure. Finally, Billy is identifying power supplies for the system. Be sure to follow future posts to see how each group member progresses in their tasks.

An IMU, or Inertial Measurement Unit, is a device capable of measuring motion within a moving frame. Though originally mechanical (think of a spinning gyroscope), many modern IMUs are electronic, and use MEMS (Micro-Electro-Mechanical Systems) to measure motion. Frequently, IMUs report acceleration, rotation, and magnetic orientation (like a compass) in three dimensions, though this varies by device. The image above depicts the LSM6DSL (the chip with the green outline), an IMU capable of measuring acceleration and rotation in 3D; its axes of acceleration and rotation are shown. The PCB above is a robotics board that combines the IMU with a motor driver IC, and is part of the μPAD kit developed by Out of the Box for the EEL4744C course at UF.
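As a sketch of how IMU data can assist alignment, here is a single update step of a simple complementary filter, which blends the gyroscope’s integrated rate (smooth, but drifts over time) with the accelerometer’s gravity direction (noisy, but drift-free) into one tilt estimate. The blending weight and axis convention are illustrative assumptions, not a description of our firmware:

```python
import math

def complementary_tilt(accel_xyz, gyro_rate_dps, prev_angle_deg, dt_s, alpha=0.98):
    """One update of a complementary filter estimating tilt about one axis.

    accel_xyz: accelerometer reading (units cancel in the ratio);
    gyro_rate_dps: angular rate in degrees/second; alpha: gyro weight.
    The 0.98 weighting and axis choice are illustrative assumptions.
    """
    ax, ay, az = accel_xyz
    accel_angle = math.degrees(math.atan2(ay, az))   # tilt implied by gravity
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt_s  # integrated rate
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle
```

Run at each sample, the gyro term tracks fast motion while the accelerometer term slowly pulls the estimate back toward the true gravity direction.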

A New Semester (Week 1)

Welcome to the New Year! In our first week this semester, after a refreshing break, the Road Watch team began with a lecture by Professor Bill McElroy, P.E. He focused on the topic of credibility, and showed how it is composed of character and competence. Then, we began planning out the upcoming semester by creating a dependency graph and estimating the duration of each major milestone. Our critical path, or the longest chronological path through the dependency graph, takes 72 days to complete. It should be noted that this is a hastily constructed draft, and the actual semester may take a very different course. However, the 72-day critical path should fit comfortably into the spring semester, with spare time to recover from mistakes. Keep checking back weekly to see how well we stick to this plan.

The draft dependency graph for the semester, which shows the dependencies between various deliverables. It also annotates each deliverable with an approximate duration. Using this information, it is possible to compute that the critical path will take 72 days.
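Conceptually, the 72-day figure comes from a longest-path computation over the dependency graph: a task can only start once all of its prerequisites finish, so the critical path is the chain of dependencies with the greatest total duration. A minimal sketch, using hypothetical tasks and durations rather than our real plan:

```python
def critical_path_length(durations, deps):
    """Length of the longest dependency chain (the critical path).

    durations: {task: days}; deps: {task: [prerequisite tasks]}.
    Tasks below are hypothetical examples, not our actual schedule.
    """
    memo = {}

    def finish(task):
        # Earliest finish = longest prerequisite chain + own duration
        if task not in memo:
            start = max((finish(d) for d in deps.get(task, [])), default=0)
            memo[task] = start + durations[task]
        return memo[task]

    return max(finish(t) for t in durations)
```

This is the same calculation project-planning tools perform when they flag which milestones have no slack.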

System Level Design Review (Week 15)

As a culmination of our semester of work, the Road Watch team attended the System Level Design Review (SLDR) event to present an overview of our system. The event started with refreshments and a networking opportunity with the various Engineering Liaisons. Then, we got to watch a very informative fireside chat between Director Latorre and Quang Tran, a Gainesville-based entrepreneur. After this, we presented our project to Kyle Bush (our Engineering Liaison), Dr. Rui Guo (our coach), and the PolarFlow Optics, SurgiGuage, and Parrotronix teams. It was very fulfilling to share our project with such a large audience, and to answer their questions and receive their feedback. The event was very polished and professional, and we appreciate all the work the IPPD administration put into making it happen. As the holidays are approaching, this will be our last post until the New Year. Thank you for following our blog this semester, and be sure to check back in next semester for more weekly updates on the Road Watch team.

The Road Watch Team dressed up for the SLDR event. From left to right: Rolando Angulo, William “Billy” Jones, Darrion Ramos, Evan Andresen, Skyler Levine, and Richard “Will” McCoy.

SLDR Presentation Preparation (Week 14)

As the semester comes to a close, we again need to summarize and exhibit our work to our sponsor, FPL. Therefore, we spent this last week drafting our SLDR (System Level Design Review) Report and Presentation. Given the volume of work performed this term, the objective of the report is to provide an efficient summary; anything more technical would be overwhelming for the reader. We wanted to make the presentation similarly concise. Afterwards, we participated in the SLDR Peer Review event with our peers in the Solar Safe and Crystal Clear Waters teams. In this event, the three teams presented to each other, and each team received constructive feedback. This provided an important opportunity to make improvements before the high-stakes presentation in front of the sponsors. We are very grateful for the feedback Solar Safe and Crystal Clear Waters provided us. Stay tuned next week to hear how our SLDR event and presentation go.

The Problem Statement slide of our SLDR presentation. With this slide, we wanted to visually show the severity of the problem our product attempts to solve, and to provide an overhead view of the zones it will be protecting.

Prototype Inspection Day (Week 12)

This Tuesday (November 14th), the Road Watch team took part in the Prototype Inspection Day (PID) event. Here, each team creates a working prototype of their design and presents it to judges with relevant experience (and to any other teams who are interested). In some respects, the event was very similar to a science fair. We were able to get the mechanical aspects of the LiDAR, the radio link, the audiovisual alerts, and the computer vision elements of the project working for this prototype, which the judges seemed impressed by. This event was a great opportunity to receive feedback on our design. Most of it was positive, though some interesting points were raised. Many judges inquired about the computational throughput of the Jetson Nano, as the computer vision demonstration was very slow. There were additional recommendations about the enclosure, and the design’s ruggedness more generally. We will be sure to take this feedback into consideration moving forward. Thank you to all the judges who offered their time and expertise to evaluate and suggest improvements for our project. In the following post, we will begin working on our System Level Design Review (SLDR), so be sure to come back next week to read more about that.

A frame from a video of one of our PID presentations (once our YouTube channel is configured, we will upload the full video). Our hardware deliverables (radio link, audiovisual alerts, mechanical LiDAR aspect, and camera vision on the Jetson Nano) can be seen on the table. The people in the image are, from left to right, Andrew Hale, Dr. Baoyun Ge, Rolando Angulo, William “Billy” Jones, Skyler Levine, and Richard “Will” McCoy. Darrion Ramos was holding the camera at the time, and unfortunately Evan Andresen was not in frame.

More Prototyping (Week 11)

Continuing its efforts from last week, and in preparation for Prototype Inspection Day, the Road Watch team has been busy prototyping. Currently, we have assigned roles as follows. Darrion will be streaming distance data from the LiDAR. Evan is developing a flowchart for our predictive algorithm. Will was tasked with streaming video from the cameras on the Jetson Nano. Rolando is creating the radio link used to send alert signals. Skyler is responsible for controlling the stepper motor that spins the mirror for the LiDAR. Lastly, Billy is creating a mock-up of the alert system. On Prototype Inspection Day, we hope to demonstrate four (simplified) working subsystems: 1) distance measurement using the LiDAR; 2) distance measurement using the camera array; 3) a spinning mirror (a separate part of the LiDAR system); and 4) activating the alert system over the radio link. We also plan to show high-level flowcharts for our predictive algorithm, for which we are eager to receive feedback. Next week we will certainly have videos to post of our prototype subsystems in action, so be sure to check in to see those.
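As a taste of the mirror-spinning work, driving a stepper motor amounts to cycling its coils through a fixed energization sequence. Below is a sketch that generates a common full-step pattern for a four-wire stepper; the coil ordering is a textbook sequence and the pin mapping is hardware-specific, so this is not our exact driver code:

```python
# A common full-step drive sequence for a four-wire bipolar stepper,
# expressed as H-bridge inputs (A1, A2, B1, B2). Ordering is a
# textbook example; real wiring may require a different sequence.
FULL_STEP = [(1, 0, 1, 0), (0, 1, 1, 0), (0, 1, 0, 1), (1, 0, 0, 1)]

def step_pattern(step_index):
    """Coil energization pattern for a given step count.

    On the Jetson, these four values would be written to GPIO pins
    feeding the motor driver (pin mapping omitted here).
    """
    return FULL_STEP[step_index % len(FULL_STEP)]
```

Advancing the index at a fixed rate sets the mirror’s rotation speed, which in turn sets the angular resolution of the 2D LiDAR scan.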

A picture of the Jetson Nano, taken by a camera connected to the Jetson Nano. The ability to capture photos using the on-board camera represents a necessary prototyping step towards the ultimate goal of processing a video stream in real time.
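For the curious, CSI cameras on the Jetson Nano are commonly read by handing a GStreamer pipeline string to OpenCV’s VideoCapture. Here is a sketch of building such a string; the resolution, frame rate, and caps are assumptions that may need adjusting for a specific camera module:

```python
def csi_pipeline(width=1280, height=720, fps=30):
    """GStreamer pipeline string for a CSI camera via nvarguscamerasrc.

    Typically passed to cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER).
    The caps below are a common default, not our exact configuration.
    """
    return (
        f"nvarguscamerasrc ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"framerate={fps}/1 ! "
        f"nvvidconv ! video/x-raw, format=BGRx ! "
        f"videoconvert ! video/x-raw, format=BGR ! appsink"
    )
```

The `nvvidconv` element does the NVMM-to-CPU memory copy and color conversion on the GPU, which matters once the goal shifts from single photos to processing a video stream in real time.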