Monthly Archives: January 2010

Lego battlebot: Bane

*READ DESCRIPTION FIRST!!!*

This is by far my most powerful and destructive battlebot to date, and it is more powerful than any other bot in its weight class. It features treads driven by XL motors and a second battery box to power the 300 g hammer weapon. The spinning hammers can pulverize nearly anything in their path, and they can hit stray Lego pieces so hard that they often fly three meters.

The only problem with the weapon is that it WILL permanently damage the pieces at its “business end”. The 1×8 studded beams are now battered and scratched, and they cannot be returned to their original condition. The bricks used as targets are also damaged: they are covered in large black scratch marks, and small chips and pieces are missing from them.

PLEASE DO NOT TRY TO RECREATE THIS ROBOT OR ITS WEAPON! IF YOU BUILD THE WEAPON INCORRECTLY, YOU COULD END UP WITH HEAVY BRICKS FLYING ACROSS THE ROOM, POTENTIALLY BREAKING WINDOWS, INJURING PEOPLE, ETC. IF YOU DO RECREATE THIS ROBOT, I AM NOT LIABLE FOR ANY DAMAGE IT CAUSES.

Duration : 0:3:53

Continue reading Lego battlebot: Bane

Mobile Robot Localization

In this video we can see a localization simulation rendered with OpenGL. The virtual robot is shown as a red mark on the floor, and the blue square represents the theoretical field of vision (the second image on the right-hand side).
The first image on the right-hand side is the real field of vision, and the two small figures summarize the real and theoretical fields of vision.
By comparing these small figures we can work out where the real robot is... so the localization problem is solved!
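This comparison step lends itself to a simple generate-and-test loop: hypothesize candidate poses, predict the theoretical field of vision for each, and keep the pose whose prediction best matches the real view. The Python sketch below illustrates that idea only; the names (render_expected_view, the pose format, the error metric) are my own assumptions, not the implementation shown in the video.

```python
import numpy as np

def localize(real_view, candidate_poses, render_expected_view):
    """Return the candidate pose whose predicted view best matches real_view.

    real_view            -- 2D array summarizing the real field of vision
    candidate_poses      -- iterable of (x, y, theta) pose hypotheses
    render_expected_view -- function mapping a pose to a predicted 2D view
    """
    best_pose, best_score = None, np.inf
    for pose in candidate_poses:
        expected = render_expected_view(pose)
        # Sum-of-squared-differences between the "small figures":
        # the smaller the score, the better the pose explains the view.
        score = np.sum((expected - real_view) ** 2)
        if score < best_score:
            best_pose, best_score = pose, score
    return best_pose
```

In practice the candidate poses might come from a grid over the floor or from a particle filter's samples, but any source of hypotheses fits the same loop.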

Duration : 0:1:29

Continue reading Mobile Robot Localization

Color Constancy for Mobile Robots via the Graphics Rendering Equation

AAAI-10 PGAI Track Submission ID 54

Real-time object tracking is a basic, yet very important functionality for vision-based robots interacting with humans, indoors and outdoors. Among many descriptive properties of the objects, color is an easily visible, but highly illumination-susceptible property to track. Since an object’s true colors are often not perceived by the camera (due to varying illumination), a form of color constancy, the ability to interpret colors of objects as they are rather than how they are perceived, is useful for vision-based mobile robots. Traditionally, the problem of color constancy has been viewed as a mapping problem from perceived color to true color based on estimated illumination. In this paper, the problem is re-framed as mapping of known illumination and true color to perceived color. Using a generate-and-test methodology, we evaluate which illumination condition leads to a perceived color that most closely matches what the robot actually sees. To generate realistic views of an object under previously unseen conditions, we use lighting simulations established by the computer graphics community. One main contribution of this paper is determining the relationship between the vision community’s color constancy equation and the graphics rendering equation. We then apply such illumination simulations on multi-colored objects to achieve color constancy for vision-based mobile robots. The tracker is fully implemented on a Segway RMP for the task of following a person, and shows good real-time results in illumination-varying scenarios.
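As a rough illustration of the generate-and-test approach the abstract describes, the sketch below renders an object's known true colors under each candidate illuminant and keeps the illuminant whose rendering most closely matches what the camera actually sees. The diagonal (per-channel gain) rendering model is a deliberate simplification of the paper's graphics rendering equation, and the function names and illuminant representation are assumptions for illustration only.

```python
import numpy as np

def estimate_illumination(observed_colors, true_colors, candidate_illuminants):
    """Pick the illuminant under which rendered colors best match observation.

    observed_colors       -- (N, 3) RGB colors as perceived by the camera
    true_colors           -- (N, 3) known true RGB colors of the same patches
    candidate_illuminants -- (M, 3) per-channel gains, one row per hypothesis
    """
    best_illum, best_error = None, np.inf
    for illum in candidate_illuminants:
        # Generate: render the perceived colors under this illuminant
        # (a diagonal model standing in for the full rendering equation).
        rendered = true_colors * illum
        # Test: compare the rendering against what the camera saw.
        error = np.sum((rendered - observed_colors) ** 2)
        if error < best_error:
            best_illum, best_error = illum, error
    return best_illum

def correct_colors(observed_colors, illuminant):
    """Map perceived colors back toward true colors under the diagonal model."""
    return observed_colors / illuminant
```

Once the best illuminant is found, dividing out its per-channel gains gives approximate true colors, which is the color-constant signal a tracker can follow across illumination changes.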

Duration : 0:1:46

Continue reading Color Constancy for Mobile Robots via the Graphics Rendering Equation
