
Monday, 4 August 2025

Those infrared contact lenses

I think it's about time I wrote about this story, as it has made the newspapers, including a good piece in the Guardian.

Basically, this follows on from something I have noted before, back in March 2019, whereby Chinese and American researchers extended the range of rodent vision by injecting nanoparticles into their eyes. Now, a team including some of the original researchers have put the nanoparticles into contact lenses which can achieve a similar effect in humans, who can report on their experience better than the mice.

I think it's important to note that the nanoparticles absorb photons with wavelengths in the range 800 to 1600 nm (a little beyond the range of IR film or digital sensors) and emit in the range 400 to 700 nm (visible light), mostly around 550 and 650 nm. So this is not thermal imaging, and it won't enable you to see in the dark (without suitable IR illumination). Also, because the particles absorb and then emit, two things happen. One is that the upconversion takes some energy, so the intensity will be reduced. The other is that this doesn't appear likely to give you a sharp image, although whether the result looks like the old HIE halation or is basically foggy isn't clear. My guess is the latter. So you could detect an IR source, but it wouldn't look like an IR photograph. However, the team say that using their material in glasses makes things sharper, so I may be misunderstanding the process.
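The energy bookkeeping is easy to sketch. Upconversion has to pool two (or more) IR photons to make one visible photon, and the surplus energy is lost in the particle, which is one reason the upconverted image is dimmer than the scene. The wavelengths below are my own illustrative choices, not figures from the paper:

```python
# Rough sketch of photon-energy bookkeeping in upconversion.
# Wavelengths are illustrative examples, not taken from the paper.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of one photon in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / 1.602e-19

# One 980 nm IR photon carries less energy than one 535 nm green photon,
# so the particle must pool the energy of at least two IR photons:
ir = photon_energy_ev(980)      # ~1.27 eV
green = photon_energy_ev(535)   # ~2.32 eV
print(f"two IR photons: {2 * ir:.2f} eV, one green photon: {green:.2f} eV")
# The surplus (~0.2 eV here) is dissipated, dimming the result.
```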

Another bonus, which the news coverage picked up on, is that the IR passes easily through thin skin, such as your eyelids, so you can see the results even with your eyes closed.

Different input wavelengths can be shifted to different output wavelengths. This isn't like colour infrared film but is actually more interesting and nuanced, with the examples reported by the Guardian being 980 nm shifted to blue, 808 nm to green and 1532 nm to red.

I look forward to finding out more.

[Original paper: www.cell.com/cell/abstract/S0092-8674(25)00454-4]

Monday, 10 July 2023

Infrared generative AI

I've been 'playing around' with a couple of generative artificial intelligence engines that carry out text-to-image. I was interested in how they understood the look of an infrared photograph.

This was partly because I find that at least one of my IR photos has found its way into the models used for generative AI. Without permission of course!

I tried a fairly simple shot: a tree. A good test for the basic Wood Effect. The results are quite good and, given how unreal IR photos can look anyway, it would be difficult to tell them from the real thing. Sometimes the background detail gives things away, but I found I could quickly get good results.

The two I tried are Stable Diffusion and Adobe Firefly (chosen simply because I have access to them) and the text prompt is the same for each: Infrared image of an oak tree in a meadow.

This is the Firefly result ...

As you can see, I got a false colour image without specifying, and in this case it's not a bad result. Often the system would put arbitrary colour washes over the image, but here the result is as you'd expect.

This is the Stable Diffusion result ...

Here, it's black & white straight away. Again, a pretty good result.

I did also try to generate some thermal images but neither engine could get anywhere near what you'd expect, presumably because there are not many, if any, thermal images in the training set.

Monday, 28 June 2021

Through a glass not-so-darkly ... infrared to visible up-conversion

I've long been familiar with the phenomenon whereby two waveforms can interact such that a third and a fourth, at the sum and difference of the two frequencies, are generated. This has been used in radio transmission and reception for decades, where it is known as heterodyning. [See the Wikipedia page.]
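The effect is easy to demonstrate numerically. This sketch (frequencies chosen arbitrarily) multiplies two sine waves and shows that the spectrum of the product contains exactly the sum and difference frequencies:

```python
# Minimal numerical illustration of heterodyning: multiplying two sine
# waves yields components at the sum and difference frequencies.
import numpy as np

FS = 10_000                      # sample rate, Hz
t = np.arange(FS) / FS           # one second of samples
f1, f2 = 1000.0, 150.0           # arbitrary example frequencies, Hz

mixed = np.sin(2 * np.pi * f1 * t) * np.sin(2 * np.pi * f2 * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), d=1 / FS)
peaks = freqs[spectrum > spectrum.max() / 2]
print(peaks)   # peaks at f1 - f2 = 850 Hz and f1 + f2 = 1150 Hz
```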

I now discover that there is a similar phenomenon at light wavelengths which can result in up-conversion allowing near infrared (AKA short wavelength infrared - SWIR) to become visible. The earliest reference would appear to be in a 1967 paper by JE Midwinter and J Warner [Up‐Conversion of Near Infrared to Visible Radiation in Lithium‐meta‐Niobate Journal of Applied Physics 38, 519].

The abstract is as follows ...

Single‐crystal lithium niobate pumped with pulsed ruby‐laser radiation has been used to convert 1.7‐μ radiation to green light with more than 1% efficiency. A narrow infrared bandwidth of 17 Å, set by the phase‐matching requirement only, allows the up‐converter and photomultiplier to operate in place of a monochromator and infrared detector, and the emission spectrum of a mercury lamp has been thus examined in the region of 1.7 μ. A close agreement between theory and practice has been found in all respects except noise performance. Further studies of this aspect are required.

Moving on to mid-June 2021 and we can see that 'further studies' have indeed been done. (This is not to say that Midwinter et al have not been hard at work; I am coming into this rather late.) The NanoWerk web site has published an article entitled Let there be light! New tech allows people to see in the dark. [Link to the article here.] It outlines how a team at the Australian National University (ANU), working with an international team, have prototyped a device, based on nanotechnology and metamaterials, which up-converts near infrared to visible wavelengths.

The lead researcher Dr Rocio Camacho Morales, is quoted in the article saying

We’ve made a very thin film, consisting of nanoscale crystals, hundreds of times thinner than a human hair, that can be directly applied to glasses and acts as a filter, allowing you to see in the darkness of the night.

Fortunately, their paper describing the work is openly available online. The title is Infrared upconversion imaging in nonlinear metasurfaces and describes the technique as follows:

In this approach, the IR image is not directly detected; instead, a parametric nonlinear optical process is employed to convert the image to higher frequencies and detect it using regular cameras in a process known as upconversion IR imaging.

So basically what we have here, albeit in rudimentary form, is a piece of 'glass' that shifts the wavelength of radiation passing through it from near infrared to visible light, without needing cooling or even imaging technology. There are nanoscale antennas on a gallium arsenide wafer, tuned to the relevant wavelengths, and what is described as a pump laser beam to interact with the incoming signal (both near infrared). The frequency of the derived waveform is the sum of the target image frequency and that of the pump laser beam. The process is also very fast, described as having 'femtosecond temporal resolution', which could enable 'ultrafast imaging of chemical reactions in a conventional microscope device', never mind the opportunities for inexpensive imaging of near infrared. I assume it would work at thermal wavelengths with the right wavelength of pump laser.
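In wavelength terms, adding frequencies means adding the reciprocals of the wavelengths. A minimal sketch, using illustrative values rather than the team's exact wavelengths:

```python
# Sum-frequency generation: output frequency = signal + pump, so the
# reciprocals of the wavelengths add. Values below are illustrative,
# not the exact wavelengths used by the ANU team.
def sfg_output_nm(signal_nm: float, pump_nm: float) -> float:
    """Output wavelength when a signal photon and a pump photon are summed."""
    return 1.0 / (1.0 / signal_nm + 1.0 / pump_nm)

# e.g. a 1550 nm infrared image mixed with an 850 nm pump beam:
print(round(sfg_output_nm(1550, 850)))  # 549, i.e. green light
```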

Check out the paper and see what you think. I'm fairly excited by the possibilities, even if it will presumably take the team a while to get it to photographic resolutions.

I will finish with their abstract, followed by the citation and link.

Infrared imaging is a crucial technique in a multitude of applications, including night vision, autonomous vehicle navigation, optical tomography, and food quality control. Conventional infrared imaging technologies, however, require the use of materials such as narrow bandgap semiconductors, which are sensitive to thermal noise and often require cryogenic cooling. We demonstrate a compact all-optical alternative to perform infrared imaging in a metasurface composed of GaAs semiconductor nanoantennas, using a nonlinear wave-mixing process. We experimentally show the upconversion of short-wave infrared wavelengths via the coherent parametric process of sum-frequency generation. In this process, an infrared image of a target is mixed inside the metasurface with a strong pump beam, translating the image from the infrared to the visible in a nanoscale ultrathin imaging device. Our results open up new opportunities for the development of compact infrared imaging devices with applications in infrared vision and life sciences.

[Rocio Camacho-Morales, Davide Rocco, Lei Xu, Valerio Flavio Gili, Nikolay Dimitrov, Lyubomir Stoyanov, Zhonghua Ma, Andrei Komar, Mykhaylo Lysevych, Fouad Karouta, Alexander A. Dreischuh, Hark Hoe H. Tan, Giuseppe Leo, Costantino De Angelis, Chennupati Jagadish, Andrey E. Miroshnichenko, Mohsen Rahmani, Dragomir N. Neshev, "Infrared upconversion imaging in nonlinear metasurfaces," Adv. Photon. 3(3) 036002 (14 June 2021) ]

doi.org/10.1117/1.AP.3.3.036002.

Tuesday, 9 July 2019

Solar panels and 'reflective' grass

I was interested to read in today's Guardian that BP are researching the most reflective kinds of grass to plant underneath solar panels; panels that pick up on top and underneath. The story points out that "bifacial panels can increase electricity output by almost 15% – but this can be much higher if the ground beneath the panel is particularly reflective".

Although the story doesn't mention it, regular readers and fans of infrared photography will know that grass (as well as other foliage) strongly reflects near-infrared light, because those wavelengths are scattered back as they travel through plant cells. Chlorophyll is transparent to near-infrared radiation. The effect is the same one that makes snow appear white, which is why infrared photos and snowy scenes can be confused.

The Solarquotes blog in 2017 looked at the proportion of solar radiation that a solar panel can exploit. Their context was UV, but if you scroll down the page you'll see a diagram showing that a silicon solar panel will make use of radiation between 400 and 1100 nanometres. Since visible light extends from about 400 nm to about 650 nm, including near-infrared nearly triples the usable wavelength range.
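That arithmetic is easy to check. A back-of-envelope sketch using the figures quoted above (note this is wavelength range, not energy; the power in the solar spectrum varies across it):

```python
# Back-of-envelope check on the bandwidth claim, using the figures in
# the post (silicon response 400-1100 nm, visible roughly 400-650 nm).
visible = 650 - 400          # 250 nm of visible wavelengths
silicon = 1100 - 400         # 700 nm usable by a silicon panel
print(silicon / visible)     # 2.8: adding NIR nearly triples the
                             # wavelength range available to the panel
```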


All this makes planting grass underneath bi-directional solar cells a logical thing to do. That reflective grass is not just fun for infrared photography then ... or grazing!

Wednesday, 13 March 2019

Injections can give you near-infrared vision

Scientists in China and the USA have developed a technique to extend the range of vision by injecting nanoparticles behind the retina (so without any external technology). These particles bind to the photoreceptors in the eye and can convert low energy (near-infrared) photons to a higher energy (green). This has been demonstrated to allow mice to add near-infrared to their range of visible 'colours' with no side-effects.

I first came across this in New Scientist: Mice given ‘night vision’ by injecting nanoparticles into their eyes ... but the full paper is also available online via Cell.com.

One interesting thing about this is that a similar transfer of energy also occurs in photosynthesis, between types of chlorophyll, and in sensitising film to longer wavelengths.

[Ma et al., Mammalian Near-Infrared Image Vision through Injectable and Self-Powered Retinal Nanoantennae, Cell (2019), https://doi.org/10.1016/j.cell.2019.01.038]

Monday, 5 June 2017

Biggles shoots infrared photographs over Mount Everest

There has been news coverage recently of the digitisation and publication of films from the archive of the Royal Geographical Society, and BBC reporter Pallab Ghosh narrated a film on the BBC News channel about some of them. One such film comes within our purview, being a record of the 1933 Houston Expedition which flew biplanes over the Everest range. They shot movie footage and stills, including some infrared plates provided by Olaf Bloch at Ilford.

In the early 1930s, infrared photography was something of a popular sensation and from about 1932 newspapers regularly printed large IR photographs demonstrating the ability to penetrate atmospheric haze and achieve extremely long distance views.

Everest summit and Chamlang taken from 100 miles away
The published book documenting the expedition, 'First over Everest', goes into some detail about the infrared setup. Plates came from Ilford, and by this time the sensitivity of infrared plates was such as to allow exposures 'as rapid as one-sixtieth' of a second. Taylor, Taylor & Hobson loaned a lens with an aperture of f/4.5 and a focal length of 25 inches (635 mm). The camera 'was a somewhat rough and ready improvisation made of plywood' which was sourced with the help of The Times newspaper and its legendary art editor Ulric Van den Bogaerde (father of actor Dirk Bogarde). There was, presumably, a quid pro quo because the Times had first publication of images from the expedition. On May 8th 1933, almost exactly a month after the flight, the Times was able to publish the expedition's most famous photograph, showing the summit of Makalu, the fifth-highest mountain in the world and 22 km east of Mount Everest itself, rising above cloud from a distance of over 100 miles.

The camera was very big and heavy and required special mounting in the plane. It was three feet long by a foot square and was placed under the fuselage, hanging in vibration-proof mountings where this aircraft was designed to carry torpedoes, and set up pointing forwards so that the field of view did not include the bottom cylinder of the engine. Plates were changed through a hatchway in the floor of the observer's cockpit. The observer had to hang upside down and put his head and hands through the floor to reach the rear frame of the camera and so access the wooden double half-plate dark slide ... hoping that the wind didn't whip it out of his hands. From this position the camera shutter could be operated. Lining up the shot involved either using the plane's intercom or, if necessary, writing notes. The cue between pilot and observer to take the shot was given using a piece of string: a sharp tug told the pilot to line up and fly straight, with a reverse pull giving the cue to fire the shutter.

The size and weight of the infrared camera were such that it couldn't be fitted to the aircraft for the high altitude flights, so infrared photographs were taken on subsidiary flights, after the main sorties had been completed. There was the added problem that, being outside the fuselage, the camera and lens would have frozen up at the higher altitude. Electric heaters were used for both men and equipment (these were open cockpits) but heating an external camera would have taken too much current.

Even though it doesn't mention the infrared work, the 1934 documentary film of the expedition, which won an Oscar that year, was called 'Wings over Everest' and is well worth a look. It's available to view freely on the BFI web site. There is a definite 'Boys Own' feel about the whole affair, and the BBC described observer Major Latham Valentine Stewart Blacker as being a real life Biggles. While some of the documentary was re-enacted in a typically 1930s way that we now see as being wooden, the real people are featured. But the expedition was filmed as it happened, including aerial footage shot from the cockpit which is a combination of shots from three of the flights.

Two final notes: the expedition was named after Lady Lucy Houston (pronounced How-sten), who provided funds and was quite a character herself (check out the Wikipedia entry on her); and, as the BFI points out, the whole thing was inspired by novelist John Buchan.

Thursday, 10 March 2016

Infrared rainbows

It was Robert Greenler who, having deduced that there should be an infrared component to a terrestrial rainbow, finally succeeded in photographing a natural one in 1970. I've had a go myself: if the sky does what you need and the camera is steady enough, it is not too difficult.

Scientific American recently published an article entitled Think You Know Rainbows? Look Again, which discussed and showed examples of rainbows that were not particularly rainbow coloured, including a white one and a red one.


The red example in the article (reproduced above) came from Wikimedia Commons and was taken by Jason Campbell in 2011. The cause, as the article explains, is that the light source is a setting sun, at which point only red light was available to be refracted by the water droplets that form the bow.

You probably see where I'm going here. Let's assume that near-IR is also in the sunlight as the sun sets and that this remains a little after the red has gone. In this case it should be possible for an infrared-only rainbow to exist.
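For what it's worth, textbook geometric optics predicts that such a bow would sit just outside the red one, because water's refractive index is slightly lower in the near infrared. A quick sketch, with approximate refractive indices I've assumed for illustration:

```python
# Primary rainbow angle from geometric optics (one internal reflection).
# The refractive indices of water are approximate, assumed values.
from math import acos, asin, degrees, sin, sqrt

def primary_bow_angle(n: float) -> float:
    """Rainbow angle (degrees) of the primary bow for refractive index n."""
    theta_i = acos(sqrt((n * n - 1.0) / 3.0))   # incidence at minimum deviation
    theta_r = asin(sin(theta_i) / n)            # refraction angle inside the drop
    return degrees(4.0 * theta_r - 2.0 * theta_i)

print(primary_bow_angle(1.331))  # red (~650 nm): about 42 degrees
print(primary_bow_angle(1.324))  # NIR (~1000 nm): about a degree larger,
                                 # so the IR bow lies just outside the red
```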

Presumably anyone with a suitable light source and some fine mist could produce one in the lab but has anyone managed to take a photograph of such a thing 'in the wild'?

[As an aside, I recently saw that the light from a rainbow is polarised along the circumference of the bow. Obvious when you think about it but I'd never noticed before.]

Monday, 16 November 2015

Graphene sheet IR sensors

A recent article in Scientific American, Ultrathin Graphene Could Improve Night-Vision Tech, reports on research at a number of labs in the United States on a method to produce room temperature thermal sensors based on 'sheets' of graphene. The original paper abstract says ...
By integrating graphene based photothermo-electric detectors with micromachined silicon nitride membranes, we are able to achieve room temperature responsivities on the order of ∼7–9 V/W (at λ = 10.6 μm), with a time constant of ∼23 ms.
Of course it's not as simple as a single sheet. The team have not only used graphene for the sensor itself but also use thin strips of graphene to hold, and thermally isolate, the sensor from the main substrate.
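Taking the quoted figures at face value, here's a rough sketch of the signal levels involved (the incident power is my own assumed example, not from the paper):

```python
# Rough signal-level sketch from the quoted figures: responsivity ~8 V/W
# (mid-range of the quoted 7-9 V/W at 10.6 um), time constant ~23 ms.
responsivity = 8.0          # V/W
tau = 23e-3                 # thermal time constant, seconds

incident_power = 1e-6       # 1 uW of absorbed IR, assumed for illustration
signal = responsivity * incident_power
print(f"signal: {signal * 1e6:.0f} uV")        # 8 uV output

# The time constant limits how fast the detector can respond:
bandwidth = 1.0 / (2.0 * 3.141592653589793 * tau)
print(f"3 dB bandwidth: {bandwidth:.0f} Hz")   # ~7 Hz
```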

Graphene-Based Thermopile for Thermal Imaging Applications
Allen L. Hsu, Patrick K. Herring, Nathaniel M. Gabor, Sungjae Ha, Yong Cheol Shin, Yi Song, Matthew Chin, Madan Dubey, Anantha P. Chandrakasan, Jing Kong, Pablo Jarillo-Herrero, and Tomás Palacios
Nano Letters 2015 15 (11), 7211-7216
DOI: 10.1021/acs.nanolett.5b01755

Thursday, 8 October 2015

October odds and ends

A couple of items for your interest.

Shutterbug published a lovely appreciation of Sir Simon Marsden on October 5th. I also found him very communicative and helpful at the time of the Centenary and really regret not meeting up, especially now I live a lot closer than I did in 2010. This has prompted me to desert Ansel Adams this year and get my 2016 calendar from the Marsden online shop.

Rather oddly, considering the piece dates back to August 2014, New Scientist just tweeted a link to an edition of their 'Last Word' column, which asks 'How far beyond the visible spectrum does a rainbow extend?'. The responses discuss both UV and IR extensions of what we see, and especially how those might be more dominant on other worlds such as Titan. It reminds me of the work of Robert Greenler who, having deduced that there should be an infrared component to a terrestrial rainbow, finally succeeded in photographing a natural one in 1970.

I recently had a visit from Ed Thompson and had a sneak preview of his upcoming book of colour infrared photographs. I'll write more about this when it's published but suffice to say there's lots of red and a delightful conceit in the way the book is packaged. In the meantime if you're in the vicinity of the Rough Print Gallery (14 Bradbury Street, Dalston in London) then images from the Red Forest and The Village portions of his epic Unseen project will be on show. The gallery Tumblr stream tells us that it's part of the White Rabbit Restaurant and the gallery is open 10-5 Wednesday/Thursday and during the restaurant opening hours. Starts 15th October and runs to the 21st.

I think that'll do for the moment.

Thursday, 28 May 2015

Your veins are the key

Given that near infrared can penetrate a little way into skin, there are an increasing number of interesting applications that exploit that. I noted a medical system for locating veins last November.

Now a Swiss company is demonstrating a sensor that uses NIR to scan the pattern of veins on your wrist, which are apparently unique to you, to use as a biometric key. There's more on the BIOWATCH web site and also in a news item on the BBC web site.

After the recent problems Apple's watch infrared sensor had with tattoos, it'd be interesting to know whether they'd cause a problem for this sensor too. Presumably, unless there's total coverage, there will still be some vein pattern to use. I assume there's a patent, and a quick search throws up some interesting examples, such as US 6799726 B2 from 2000, which uses near-field radio in a wrist-watch to access ski lifts, and WO 1988004153 A1 from 1987, which concerns biometric sensing. As this latter patent points out, biometrics often require the user to carry out a special action, such as looking into something or placing a hand on something. With the advent of sensors small enough to fit onto a wristwatch strap, the sensing can be genuinely unobtrusive.

Friday, 6 March 2015

Infrared and tissue penetration

A recent article in New Scientist caught my eye. My drug-filled nanospheres heal at the speed of light reports work by a team led by Professor Adah Almutairi at the University of California, San Diego.

Her work shows that, by making use of near infrared's ability to penetrate skin and tissue, a laser of the appropriate wavelength can trigger a polymer nanosphere to break down and, if it's carrying a drug, to release it. Since the light can be tightly focused, the release site for the drug can be targeted precisely. There are other mechanisms for triggering release, such as the temperature of inflammation or even sunlight on the skin. This latter has the neat prospect of a sunscreen that activates when you get into the sun.

While the NS article is very recent, I found a paper from 2011 by Almutairi's team that explains the concept:

Low Power, Biologically Benign NIR Light Triggers Polymer Disassembly Fomina et al
Macromolecules, 2011, 44 (21), pp 8590–8597

You can see the abstract or buy the paper from ACS Publications.

One interesting thing, for me, is the statement at the top of the abstract that "Near infrared (NIR) irradiation can penetrate up to 10cm deep into tissues". Admittedly, from a photographic point of view you need to remember that the IR has to penetrate the tissue and then get back out again, but I believe the figure of 'a few millimetres' has been a good rule of thumb for years. A figure of between one and two cm has been cited from a paper by Gao et al, In vivo cancer targeting and imaging with semiconductor quantum dots from 2004.
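That photographic caveat can be sketched with a simple Beer-Lambert attenuation model: reflected light crosses the tissue twice, so the loss is squared. The attenuation coefficient below is an assumed, purely illustrative value, not a measured one:

```python
# Beer-Lambert sketch of why photographic penetration looks much
# shallower than one-way penetration: the light goes in and back out.
# The attenuation coefficient is an assumed, illustrative value.
from math import exp

MU = 0.5     # effective attenuation per cm (illustrative NIR-in-tissue value)

def surviving_fraction(depth_cm: float, round_trip: bool = True) -> float:
    """Fraction of light surviving a pass to depth_cm (and back, if set)."""
    path = 2.0 * depth_cm if round_trip else depth_cm
    return exp(-MU * path)

print(surviving_fraction(1.0, round_trip=False))  # one way through 1 cm: ~0.61
print(surviving_fraction(1.0))                    # out and back: only ~0.37
```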

So I decided to see what Lou Gibson had to say on the matter in his third edition of Clark's 'Photography by Infrared' (1978). He quotes Balderry and Ewald, in a 1924 paper called 'Life Energy in Therapeutics', as saying that sunlight can penetrate up to 25 cm into the body. So that 10 cm seems quite reasonable.

Photographically, however, a near infrared photograph will often show veins under the skin, and will almost always give people a 5-o'clock shadow (even some women). This is also, as I've pointed out before, the cause of the alabaster look you can see in infrared portraits. I'll leave you with this image of Jude (a lady with whom I used to work) demonstrating the infrared look, with the added 'bonus' of 35mm infrared film grain.

Tuesday, 3 March 2015

More on visual sensitivity to IR

Following on from the experiment to extend human vision into the near infrared (part of my previous post) I found related papers in the Journal of the Optical Society of America and Acta Physica Polonica.

The first from OSA:

Visual sensitivity of the eye to infrared laser radiation
David H. Sliney, Robert T. Wangemann, James K. Franks (all U.S. Army Environmental Hygiene Agency, Aberdeen Proving Ground, Maryland) and Myron L. Wolbarsht (Duke University Eye Center, Durham, North Carolina)

JOSA, Vol 66, Issue 4, pp. 339-341 (1976)

You can find the paper online (membership/paywall) on the Optics Infobase

The abstract reads:
The foveal sensitivity to several near-infrared laser wavelengths was measured. It was found that the eye could respond to radiation at wavelengths at least as far as 1064 nm. A continuous 1064 nm laser source appeared red, but a 1060 nm pulsed laser source appeared green, which suggests the presence of second harmonic generation in the retina.

Similarly, but more recent and with open access is ...

Perception of the Laser Radiation for the Near Infrared Range
D. Kecik, J. Kasprzak (both Department and Ophthalmology Clinic, I Medicine Faculty, Medical University of Warsaw) and A. Zając (Institute of Optoelectronics, Military University of Technology, Warsaw)

ACTA PHYSICA POLONICA A, Vol 120, No 4, pp 686-687

This paper is available as a PDF.

This abstract reads:
During the diagnostic research done by means of optical devices equipped with radiation sources from the near infrared range the phenomena indicating the perception possibility of this range by a human eye were observed. In this contribution the initial results of the research of this phenomenon were presented. Sources of radiation applied in laser polarimeters (785 nm) and devices designed for optical coherent tomography (820, 850 nm) were taken into particular consideration. Perception tests with the use of a laser diode generating at the wavelength of 940 nm were also carried out. It was stated that the radiation from the range examined can be recorded by a human eye giving a colour sensation — in practice independently of the wavelength of the radiation beam falling into a retina.
I suspect this kind of experiment with lasers falls into the 'don't try this at home' category, but it demonstrates that the usual limits quoted for human visual sensitivity are not necessarily the only ones.

There are some interesting notes about this in the Wikipedia entry on light, too.

The final word may have to go to a paper I found in the Proceedings of the National Academy of Sciences of the United States of America (PNAS vol 111 no 50), with (sorry) too many authors to list (let's just say Palczewska et al), entitled Human infrared vision is triggered by two-photon chromophore isomerization. It's almost totally out of my area of expertise, but if you have the knowledge you can access the paper (paywall) via this link. I'll leave it to the report's authors to have the final word, quoted from their 'significance' statement.
This study resolves a long-standing question about the ability of humans to perceive near infrared radiation (IR) and identifies a mechanism driving human IR vision.

Wednesday, 11 February 2015

More snippets

Apologies for not posting yet this year ... but here are a few items to make up for it.

Towards the end of last year I came across a claim that a special diet could extend human vision into the edge of the near infrared.

Petapixel carried an explanation of the research project and also a rebuttal by a neuroscientist. The original crowd-funded experiment page is on experiment.com and the group carrying out the research is called Science for the Masses. Since last August the web seems to have gone quiet on the project.

A slight increase in deep red sensitivity would be useful for astronomers wanting to view the universe at the wavelength of hydrogen-alpha: 656.28 nm. Canon produced a camera modified for better response at this wavelength a few years back, and Nikon have now done the same, although theirs is a high-resolution full-frame camera: the D810A. The older Canon still had some infrared filtering in place, so it couldn't be used for infrared photography, but it is unclear whether this is the case with the Nikon. The press release doesn't say, although DP Review suggests that there is still filtering.

For those of you interested in the BBC's natural history infrared shooting, there is a training film online where Colin Jackson explains his technique. However, his team has since moved on to using modified Canon DSLR cameras rather than 'pure' video cameras, so the film is a little out of date.

Finally, a thermal imaging video showing cloud formations across the earth, shot from space at a wavelength of 6.5 µm. (This is worth expanding for a better view.)


Wednesday, 4 June 2014

Animals as thermal 'detectors'

Thermal imaging cameras being as expensive as they are, it's interesting to see how observation of animal behaviour can provide clues to some aspects of temperature distribution in the environment. The classic is how to detect that a car has only recently arrived at a property ... if you can't touch the bonnet (hood). A thermal camera will show a warm spot on the bonnet, or (in classic detective mode) you might see that a cat has decided to lie there. Pigeons can also be useful detectors. You will often see birds roosting on top of one building in a group, and this is often simply because that roof is warmer; either because of a local heat source such as an air outlet or because that roof is less insulated than the others. Snow, melting first on the warmest roofs, will fulfil the same function.

A similar, but inverse, phenomenon has been observed by researchers in Australia. They were using thermal imaging to study how koalas regulate their body temperature, given the hot climate. Hugging trees was one mechanism, the tree trunk being cooler than the surroundings. The animals were observed moving from the top leaves where they feed in winter down to cooler parts of the tree in summer. Conveniently this would provide a perch where the koala could either lie spread on top of a shady branch (as some big cats are seen to do) or wedge themselves in a junction between a large branch and the trunk. Thermal imaging revealed how the koala uses the tree trunk and/or branch as a heat sink.

The study is published (freely accessible) in the Royal Society journal Biology Letters [Biol. Lett. June 2014 vol. 10 no. 6 20140235] and you can read (and see) more on the BBC web site which also outlines other research into how animals can exploit microclimates in trees and other means to combat high temperatures.

Similar themes can be explored from an earlier blog post on toucans and trees.

Thursday, 1 May 2014

All About Imaging: Transactions

All About Imaging: Transactions, according to its agenda, is an interdisciplinary symposium exploring new directions in contemporary imaging from the perspectives of art, science and technology. It takes place at the University of Westminster Harrow campus on May 22nd and 23rd.

I note it here mainly because of a session on day one which includes a presentation on 'Imaging the invisible in medicine' by Professor Francis Ring. Prof Ring is one of the pioneers of thermal imaging, with a special interest in medical applications.

That said, there is much of interest, including the opening of the RPS International Images for Science 2013 exhibition on the 22nd. The full schedule and more information can be found on the university's web site. Members of the Royal Photographic Society are entitled to a discounted registration fee.

Saturday, 5 April 2014

Improvements to graphene detectors

Last June, I noted research in Singapore which promised a wide-ranging imaging sensor based on graphene. The hyperspectral detection of graphene ranges from ultraviolet to far infrared but there has been a problem with very low sensitivity of the single layer of carbon atoms.

A paper in Nature Nanotechnology, published on March 16 2014, outlines a method devised by researchers at the University of Michigan whereby electrons freed by photons hitting a first layer of graphene tunnel through an insulating barrier layer and into a second graphene layer. This affects the current flowing through the second graphene layer, and that change is what is detected. The result is a dramatic increase in sensitivity, as well as IR detectors that perform well at room temperature. This is all explained in a press release from the University of Michigan. The 'trick' was to look into how the signal could be amplified, rather than making the signal itself stronger. (I haven't read the whole paper, but I would ask what noise this generates.)

"We can make the entire design super-thin," said Zhaohui Zhong, assistant professor of electrical and computer engineering at Michigan, and one of the inventors. "It can be stacked on a contact lens or integrated with a cell phone." This mention of contact lenses led the Register to ask "Want to see at night? Here comes the infrared CONTACT LENS".

A patent, 'Photodetector based on double layer heterostructures', has been applied for.

Tuesday, 4 February 2014

RPS Journal Archive online

It's a momentous event for photographic historians. The Royal Photographic Society, who have just launched their revamped web site, have also put a fully searchable archive of the famous Phot J ... the Photographic Journal, now the RPS Journal, online with free access and free text search. It's at archive.rps.org

You can find out the background to the project on the Townsweb Archiving blog.


I haven't really explored yet, but the page scans look very good. My only quibbles are that search results are listed by Journal volume, where a date would be nice, and that to return to the search results you have to use the browser's 'back' button: but that's being picky. It is a fantastic resource. Enter 'infra-red' and then 'infrared' as your search term and see how it all started.

Wednesday, 30 October 2013

Frogs and leaf growth

Back in January I noted research into the infrared reflectance of insects carried out by Michael Mielewczik and others. Michael has contacted me again about two more papers on similar subjects.

The first is Non-Invasive Measurement of Frog Skin Reflectivity in High Spatial Resolution Using a Dual Hyperspectral Approach [1] (on PLOS ONE here with a PDF here).



As before, the team used a camera with filtering that split the near-infrared (specifically the red edge, between 675 and 775 nm) and blue to explore the 'colour' of frog skin. They also used two further hyperspectral cameras, one sensitive to visible and near-infrared between 400 and 1000 nm and one to SWIR (short-wave infrared) between 1000 and 2500 nm. This image is of Agalychnis callidryas using the red-edge camera.
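To give a feel for why the red edge is so useful here, a normalized difference between the two sides of it separates vegetation-like reflectance from IR-dark surfaces. This is my own illustration with made-up numbers, not the processing pipeline from the paper:

```python
import numpy as np

# Illustrative only: two tiny synthetic band images standing in for
# the 675 nm (red) and 775 nm (near-infrared) sides of the red edge.
red_675 = np.array([[0.10, 0.12], [0.09, 0.11]])   # low red reflectance
nir_775 = np.array([[0.45, 0.50], [0.40, 0.48]])   # high NIR reflectance

# Normalized difference: vegetation-style tissue gives values near +1,
# surfaces that are dark in the infrared give values near 0 or below.
red_edge_index = (nir_775 - red_675) / (nir_775 + red_675)
print(red_edge_index.round(2))
```

The same arithmetic underlies vegetation indices such as NDVI in remote sensing, which is why camouflage that matches foliage visually can still stand out across the red edge.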

I've come across hyperspectral cameras before and they're quite fascinating devices. They produce a multi-dimensional image where each pixel in the x and y plane has a complete spectrum recorded along the z axis ... so z records intensity at a range of wavelengths. This means that you can choose which wavelength (or wavelengths) to view the scene at after the fact. Of course, this multiplies the amount of data dramatically.
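The data-cube idea can be sketched in a few lines of NumPy. The cube below is entirely hypothetical (random values, 10 nm band spacing chosen for illustration), but it shows how a band image or a per-pixel spectrum falls out of simple slicing after capture:

```python
import numpy as np

# Hypothetical hyperspectral cube: 100 x 100 pixels with 61 bands
# covering 400-1000 nm in 10 nm steps (values here are random).
wavelengths = np.arange(400, 1001, 10)             # z axis: 61 bands
cube = np.random.rand(100, 100, wavelengths.size)  # axes: y, x, z

# Choose a band after the fact, e.g. the one nearest 550 nm.
band_index = int(np.abs(wavelengths - 550).argmin())
green_image = cube[:, :, band_index]   # an ordinary 2-D image

# Or take the full z column at one (y, x) location: one pixel's spectrum.
spectrum = cube[50, 50, :]

print(green_image.shape, spectrum.shape)   # (100, 100) (61,)
```

Even this toy cube holds 61 times the data of a monochrome frame, which is the storage penalty mentioned above.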

The second paper uses infrared imaging to help a study of leaf growth. The paper is Diel leaf growth of soybean: a novel method to analyze two-dimensional leaf expansion in high temporal resolution based on a marker tracking approach (Martrack Leaf) [2], available on the Plant Methods web site. This study used dark beads attached to the margins of a leaf and a camera fitted with a 940nm narrow bandpass filter. At this wavelength the leaf is brighter than the beads which makes image analysis easier.
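The reason the 940nm filter helps is that segmentation collapses to a simple threshold: leaf bright, beads dark. A minimal sketch of that idea (my own toy version, not the Martrack Leaf code) might look like this:

```python
import numpy as np

# Synthetic 940 nm frame: the leaf is bright at this wavelength,
# the marker beads dark. Two single-pixel 'beads' for simplicity.
frame = np.full((8, 8), 0.9)       # bright leaf background
frame[2, 3] = frame[6, 5] = 0.1    # two dark beads

bead_mask = frame < 0.5            # dark pixels are bead candidates
ys, xs = np.nonzero(bead_mask)
centroids = sorted(zip(ys.tolist(), xs.tolist()))
print(centroids)                   # [(2, 3), (6, 5)]
```

Tracking the bead centroids from frame to frame then gives the displacement of the leaf margin, from which two-dimensional expansion can be computed over the day-night (diel) cycle.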

[1] Pinto F, Mielewczik M, Liebisch F, Walter A, Greven H, et al. (2013) PLoS ONE 8(9): e73234. doi:10.1371/journal.pone.0073234
[2] Mielewczik M, Friedli M, Kirchgessner N, Walter A. Plant Methods 2013, 9:30 doi:10.1186/1746-4811-9-30

[Note: corrected information about the hyperspectral camera added 31 October]

Wednesday, 23 October 2013

The physics of near-infrared photography

Klaus Mangold (a photographer), Joseph A Shaw and Michael Vollmer (who are physicists) have just published a paper, The physics of near-infrared photography in the European Journal of Physics. This is the best technical paper on the subject that I've seen since Clark's book Photography by Infrared (which went out of print in 1984).

The European Journal of Physics has a policy of making papers freely available for 30 days from publication, although you will need to set up an online account to access it.

Amongst other things the paper tells us that red wine, Diet Coke and even espresso coffee are transparent to near-infrared wavelengths.

This is the URL: stacks.iop.org/EJP/34/S51

The citation is Eur. J. Phys. 34 (2013) S51–S71

Saturday, 1 June 2013

Graphene sensor offers better visible and near-mid IR imaging

A research team in Singapore has developed an imaging sensor made from graphene which promises better light-gathering over a wider spectrum and to be cheaper than existing sensors such as CMOS and CCD.

A paper in Nature Communications, Broadband high photoresponse from pure monolayer graphene photodetector [abstract], outlines the work although you have to subscribe/pay to access the paper. There is more explanation at phys.org.

A patent is being applied for to cover this technology and the team, led by Assistant Professor Wang Qijie at Nanyang Technological University, will be looking for industrial partners in order to turn this into a commercial product.

This development could lead to cheaper cameras with a response extending into the mid-infrared, which is useful for a range of applications including the reflectography used in art restoration, and the extra sensitivity across the whole spectrum will come in handy as well. A significant achievement ... adding another string to graphene's improbable bow.