The standard colour sensor block in EV3Lab unfortunately does not give you the raw RGB values (Red, Green and Blue). This is a problem if you want to calibrate your colour sensors properly, i.e. look at the raw value of each component and derive from those values whether the sensor is seeing a specific colour you're interested in.
Fortunately, you can install a 3rd party sensor block which gives you access to all these values. Please see EV3 Color Sensor RGB Block Enhanced – OFDL Robotics Lab Taiwan. Note: although third party plug-ins were previously not allowed in FLL competitions, from 2020 that restriction was lifted and any software can now be used to program the robot.
If you have multiple colour sensors, please use the same generation of colour sensor, because the RGB values for the same colour can differ a lot between generations. You can determine the generation from the code printed on the sensor itself. The code has the format 99N9, where the first two digits are the week number and the last digit is the year number (counting from 2000).
Our menu program has a section where it reads the RGB values from our 2 colour sensors and displays them on-screen. We limit it to about 2 readings per second so you can properly read the values on screen. We use this to check that the colour sensors are working properly, as well as to get the values we need to calibrate our White / Black detection algorithms.
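Our menu is built from EV3Lab blocks, but if you prefer text code, a minimal sketch of such a display loop in Pybricks MicroPython could look like the following (the ports S2/S3 and the 500 ms delay are assumptions based on the description above; the scale of the values depends on the tool you use):

```python
from pybricks.hubs import EV3Brick
from pybricks.ev3devices import ColorSensor
from pybricks.parameters import Port
from pybricks.tools import wait

ev3 = EV3Brick()
left_sensor = ColorSensor(Port.S2)    # sensor ports are an assumption
right_sensor = ColorSensor(Port.S3)

while True:
    ev3.screen.clear()
    # rgb() returns the red, green and blue reflection values of the sensor.
    ev3.screen.print("S2:", left_sensor.rgb())
    ev3.screen.print("S3:", right_sensor.rgb())
    wait(500)    # roughly 2 readings per second, so the values stay readable
```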
See below for the implementation we use to identify whether the colour is white or black. We pass in the colour sensor port number because we might be interested in different colour sensors at different points in our program. It outputs two logical values: one for black and one for white.
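As a rough text equivalent of this block, here is a sketch in Pybricks MicroPython. The sensor ports (2 and 3) and the threshold constants mirror what is described in this post, but they are assumptions for illustration only; Pybricks rgb() returns percentages from 0 to 100, which may not match the scale of the RGB block you use, so calibrate against your own readings.

```python
from pybricks.ev3devices import ColorSensor
from pybricks.parameters import Port

# Create each sensor once; ports 2 and 3 match the sensors we use below.
SENSORS = {2: ColorSensor(Port.S2), 3: ColorSensor(Port.S3)}

# Thresholds follow the values discussed below. They are specific to our
# sensor generation, our RGB scale and the sensor height above the mat,
# so always recalibrate them for your own setup.
WHITE_MIN = (100, 100, 50)    # minimum red, green, blue to call it white
BLACK_MAX = (50, 50, 30)      # maximum red, green, blue to call it black

def detect_black_white(port_number):
    """Return (is_black, is_white) for the colour sensor on the given port."""
    r, g, b = SENSORS[port_number].rgb()
    is_white = r >= WHITE_MIN[0] and g >= WHITE_MIN[1] and b >= WHITE_MIN[2]
    is_black = r <= BLACK_MAX[0] and g <= BLACK_MAX[1] and b <= BLACK_MAX[2]
    return is_black, is_white
```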
In the implementation above, white requires a Red component of at least 100, a Green component of at least 100 and a Blue component of at least 50. Similarly, black requires a Red component less than or equal to 50, a Green component less than or equal to 50 and a Blue component less than or equal to 30. Just note that these values are specific to the generation of sensor we're using (xxN3) and the height of the sensor above the mat. Don't rely on these values, but rather calibrate your colour sensors properly.
Once you have this component, it is easy to implement logic to move forward until the robot hits a white or black line. We have 2 sensors and we require both sensors to see the same colour; this also allows the robot to align itself against the line. We take the motor speed as input and move both motors independently at this speed until the colour is reached on the corresponding sensor (sensors 2 and 3 in the implementation below). Note that you need to stop each motor as soon as the colour is reached, otherwise the robot won't align properly.
Move to white:
Move to black:
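A minimal sketch of these two blocks in Pybricks MicroPython might look as follows. It reuses detect_black_white from the sketch above; the motor ports B and C and the left/right mapping of sensors 2 and 3 are assumptions. Each motor is held the moment its own sensor sees the target colour, which is what lets the robot square up against the line.

```python
from pybricks.ev3devices import Motor
from pybricks.parameters import Port

left_motor = Motor(Port.B)     # motor ports are an assumption
right_motor = Motor(Port.C)

def move_to_line(speed, want_white):
    """Drive both motors at `speed` until BOTH sensors see the target colour.

    Each motor is stopped individually as soon as its own sensor sees the
    colour, so the robot aligns itself against the line.
    """
    left_done = False
    right_done = False
    left_motor.run(speed)      # speed in degrees per second
    right_motor.run(speed)
    while not (left_done and right_done):
        if not left_done:
            black, white = detect_black_white(2)   # left sensor on port 2
            if (white if want_white else black):
                left_motor.hold()                  # stop immediately
                left_done = True
        if not right_done:
            black, white = detect_black_white(3)   # right sensor on port 3
            if (white if want_white else black):
                right_motor.hold()
                right_done = True

def move_to_white(speed):
    move_to_line(speed, want_white=True)

def move_to_black(speed):
    move_to_line(speed, want_white=False)
```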
For proper line squaring you might need to do a couple of iterations: first move until you see white, then at a slower speed move until you see black, and then even slower back to white. This is called line squaring and is a good technique to limit the uncertainty of your gyro sensor (see the section on the gyro sensor for more information). Note that if you can find a line in both the X-direction and the Y-direction, you can consistently position your robot with high precision on the mat.
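Using the sketches above, such an iteration could look like this (the speeds are arbitrary placeholders; tune them for your robot):

```python
def line_square():
    # Speeds are in degrees per second and are placeholders only.
    move_to_white(200)    # approach quickly until both sensors see white
    move_to_black(100)    # creep forward more slowly onto the black line
    move_to_white(-50)    # even slower, reverse back until white again
```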