Posts

Showing posts from March, 2020

Obstacle Avoidance

Introduction. The aim of this project is for our car to complete a full lap around the circuit without colliding with any of the obstacles placed along it. We will be using the VFF (Virtual Force Field) method. Our only sensor will be an array of lasers located at the front of the car. Although our code will (again) be based on a reactive method, we will need a map to show us the location of the targets we are to reach. Our global localisation system will also be constantly providing us with the position of the robot. However similar this exercise might seem to the Follow Line project, their implementations are completely different. For this exercise we are given an array of lasers instead of a camera, so there will be no filtering and no colour spaces. Each laser measures the distance in a specific direction and returns a number. VFF Explained. As I said, we will be using this method to compute both the forward speed and the turn speed of the robot. This method ca...
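As a teaser, here is a minimal sketch of the VFF idea as described above: the target pulls the robot with an attractive force, every laser reading pushes it away with a repulsive force, and the sum of both is mapped to a forward speed and a turn speed. All names and gains below (laser_ranges, beam_angles, robot_pose, target) are placeholders I am assuming for illustration, not the exercise's actual API.

```python
import numpy as np

def attractive_force(robot_pose, target, gain=1.0):
    """Unit vector pointing from the robot towards the target, expressed in the robot frame."""
    dx, dy = target[0] - robot_pose[0], target[1] - robot_pose[1]
    yaw = robot_pose[2]
    # Rotate the world-frame displacement into the robot frame so all forces share one frame.
    local = np.array([np.cos(yaw) * dx + np.sin(yaw) * dy,
                      -np.sin(yaw) * dx + np.cos(yaw) * dy])
    norm = np.linalg.norm(local)
    return gain * local / norm if norm > 1e-6 else np.zeros(2)

def repulsive_force(laser_ranges, beam_angles, gain=1.0, max_range=10.0):
    """Each detected obstacle pushes the robot away, more strongly the closer it is."""
    force = np.zeros(2)
    for r, ang in zip(laser_ranges, beam_angles):
        r = min(r, max_range)
        weight = (max_range - r) / max_range          # closer obstacle -> stronger push
        force -= weight * np.array([np.cos(ang), np.sin(ang)])
    return gain * force

def vff_speeds(laser_ranges, beam_angles, robot_pose, target):
    """Add both forces and map the resulting vector to (forward speed, turn speed)."""
    total = attractive_force(robot_pose, target) + repulsive_force(laser_ranges, beam_angles)
    forward = max(0.0, total[0])            # only drive forward when the force points ahead
    turn = np.arctan2(total[1], total[0])   # steer towards the direction of the total force
    return forward, turn
```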

Cat & Mouse Drones

Introduction. For this project, we are to develop the code (Python) for a drone – the cat – to follow another drone – the mouse – which will be flying randomly around the scene. Our drone is equipped with two cameras – one front and one ventral – through which it receives visual input from its surroundings. We don't have access to the Python script for the drone playing the mouse, and its movement is apparently random, so we must follow it based solely on our visual input. Just as in our previous project, our script will be based on a reactive control method. This means that on every iteration of the loop our drone gets an image from a camera, processes it and makes a decision based on it. Since many concepts were already covered in the previous entry of this blog, I'll try to focus on the new ones. Perception and filtering. What does an image captured by our front camera look like, you may wonder? Well, nothing ...
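For the curious, here is a rough sketch of the kind of per-iteration perception and reaction described above, using OpenCV (cv2) colour filtering. The HSV bounds, gains and function names are placeholders I am assuming for illustration, not the exercise's real values or API.

```python
import cv2
import numpy as np

# Hypothetical HSV bounds for the mouse drone's colour (placeholders, not the real ones).
LOWER_HSV = np.array([40, 80, 80])
UPPER_HSV = np.array([80, 255, 255])

def locate_mouse(frame):
    """Return the (x, y) centroid of the filtered blob, or None if the mouse is not visible."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)      # work in HSV, which is robust to lighting
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)     # keep only pixels matching the mouse colour
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:                               # nothing matched the filter this iteration
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def follow_command(centroid, frame_shape, k_yaw=0.005, k_up=0.005):
    """Simple proportional reaction: turn and climb towards the centroid of the mouse."""
    h, w = frame_shape[:2]
    err_x = centroid[0] - w / 2.0      # horizontal offset from image centre -> yaw rate
    err_y = h / 2.0 - centroid[1]      # vertical offset from image centre -> vertical speed
    return k_yaw * err_x, k_up * err_y
```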