United Kingdom Project Notice - Future Colour Imaging


Project Notice

PNR 30780
Project Name Future Colour Imaging
Project Detail Colour imaging is part of everyday life. Whether we watch TV, browse content on our tablets or phones, or use apps and software in our work, the content we see on our screens is the result of decades of colour and imaging research. In the future, the challenge is to understand more about the content of images. As an example, in autonomous driving we wish to build a platform that sees the road independent of the atmospheric conditions: we don't want to crash when we are driving in fog. It is well known that an image that records the near-infrared (NIR) signal is much sharper than its RGB counterpart in foggy conditions. What is near infrared? The visible spectrum has a natural rainbow order: violet, indigo, blue, green, yellow, orange and red. Infrared is the next colour after red, which we can't quite see. Image fusion can be used to map the RGB+NIR signal to a fused RGB counterpart that we can see. Through image fusion the same detail will be present in foggy or non-foggy conditions. Advantageously, image fusion is a tool that allows non-visible information to be incorporated and deployed in existing RGB-based AI scene-interpretation systems with minimal retraining.
Our project begins with the Spectral Edge image fusion method, the current leading technique. This method - like most image fusion algorithms - works by combining edges from the four channels (RGB+NIR) to make a fused RGB-only, three-channel edge map. The edges are then transformed (the technical term is reintegrated) back into a colour image. Unfortunately, and necessarily, the reintegrated images often have defects such as bright halos around edges or smearing. We argue that these defects are a direct consequence of how edges are defined. In our research we will - based on a surprising mathematical insight - develop a new definition of edge, quite a bold thing to do after 50 years of image processing research! By construction, images reintegrated from the new edges will have far fewer halo and smearing artefacts.
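The fuse-edges-then-reintegrate pipeline described above can be sketched in miniature. The following is a toy 1-D illustration only, not the actual Spectral Edge algorithm: the max-magnitude edge-selection rule and the cumulative-sum reintegration are simplifying assumptions made for this example.

```python
import numpy as np

def fuse_and_reintegrate(rgb, nir):
    """Toy 1-D sketch of edge-based image fusion (hypothetical, not the
    Spectral Edge method): for each RGB channel, keep whichever edge
    (finite difference) is stronger -- the channel's own or the NIR's --
    then reintegrate by cumulative summation."""
    fused = np.empty_like(rgb)
    d_nir = np.diff(nir)                       # NIR edges
    for c in range(3):
        d_rgb = np.diff(rgb[c])                # this channel's edges
        # pick the larger-magnitude edge at every position
        d_fused = np.where(np.abs(d_nir) > np.abs(d_rgb), d_nir, d_rgb)
        # reintegrate: anchor at the channel's first sample, then cumsum
        fused[c] = np.concatenate(([rgb[c, 0]],
                                   rgb[c, 0] + np.cumsum(d_fused)))
    return fused

# A "foggy" signal: the red channel has a washed-out ramp where the
# NIR channel records a sharp step (values are made up for illustration).
rgb = np.array([[0.2, 0.25, 0.3, 0.35, 0.4],   # R: blurred edge
                [0.2, 0.2,  0.2, 0.2,  0.2],   # G: flat
                [0.2, 0.2,  0.2, 0.2,  0.2]])  # B: flat
nir = np.array([0.1, 0.1, 0.9, 0.9, 0.9])      # sharp edge at index 2
out = fuse_and_reintegrate(rgb, nir)
# the fused channels now contain the sharp step from the NIR signal
```

In a real 2-D fusion method the reintegration step is much harder (a gradient field assembled from several images is generally not integrable, which is exactly where the halo and smearing defects mentioned above come from); a least-squares or Poisson solve stands in for the cumulative sum.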
We will then use our improved edge representation and improved image fusion algorithm to make better-looking images. These might be the fused images themselves: wouldn't it be great to have smart binoculars that let us see more detail when it is rainy, or in a landscape blurred by distance? However, we also believe the future of photography, in general, is content-based, and that image fusion will help us determine the content in an image. As an example, when we take a picture at sunset, the shadows in the scene are very blue, but outside the shadows the light is very warm (orangish). The best image reproductions for these scenes involve manually and differentially processing the shadow and non-shadow regions. Here, we seek to find the illumination content in an image automatically. Then, in a second step, we will develop a new content-based framework for manipulating images so that, for this sunset example, we don't need to edit the photos ourselves.
In complementary work, we are also interested in helping people see better. Indeed, there is a lot of research demonstrating that coloured filters can help mitigate visual stress. Coloured filters are used in dyslexia (sometimes leading to dramatic improvements in reading speed), and there is now blue-absorbing glass which reduces the blue light coming from a tablet display (since blue light at night tends to keep you awake). Much of the prior art in this area is direct: a filter is chosen to directly change how we see (simply, if we put a yellow filter in front of the eye then everything looks more yellow). Our idea is to design filters that are related to the tasks we need to solve. For the problem of matching colours, we will design filters so that someone who suffers from colour-blindness will be able to colour match as if they had normal colour vision. We will also develop indirect solutions for the blue-light problem and visual stress.
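The sunset workflow above - label each pixel as shadow or non-shadow, then process the two regions differently - can be sketched as follows. This is a minimal, hypothetical example: the mask and the per-region gain values are invented for illustration, and a real system would estimate the illumination map automatically rather than take it as input.

```python
import numpy as np

def differential_white_balance(img, shadow_mask, shadow_gain, sun_gain):
    """Apply per-region RGB channel gains: one set of gains to shadow
    pixels (bluish skylight), another to directly lit pixels (warm
    sunlight). img has shape (H, W, 3); shadow_mask has shape (H, W)."""
    gains = np.where(shadow_mask[..., None], shadow_gain, sun_gain)
    return np.clip(img * gains, 0.0, 1.0)

img = np.full((2, 2, 3), 0.5)                 # flat grey test image
shadow = np.array([[True, False],
                   [False, True]])            # toy illumination map
shadow_gain = np.array([1.0, 1.0, 0.7])       # tame the blue in shadows
sun_gain = np.array([0.85, 1.0, 1.1])         # cool the warm sunlit area
out = differential_white_balance(img, shadow, shadow_gain, sun_gain)
```

Here `shadow_mask[..., None]` broadcasts the (H, W) illumination map against the (3,) gain vectors so every pixel gets the gains of its own region in one vectorised step.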
Funded By Self-Funded
Sector Painting
Country United Kingdom, Western Europe
Project Value GBP 1,046,725

Contact Information

Company Name University of East Anglia
Web Site https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/S028730/1
