Showing posts with label correlations. Show all posts

Thursday, April 13, 2017

Quantum imaging beyond the classical Rayleigh limit

A decade has passed since we worked on quantum imaging, as reported in an article in the New Journal of Physics that was downloaded 2316 times. We had described the experimental set-up in a second article in Optics Express that was viewed 540 times. Interestingly, the second article was most popular in May 2016, suggesting we were some 6 years ahead of our time with this publication, and over 10 years ahead when Neil Gunther started actively working on the experiment. The problem with being too early is that it is more difficult to get funding.

Edoardo Charbon continued the research at the Technical University of Delft, where he built a true digital camera that used a built-in flash to create a three-dimensional model of the scene, and the sunlight to create a texture map of the image that could be mapped on the 3-d model. This is possible because the photons from the built-in flash—a chaotic light source that produces the photons from excited particles—and those from the sun—which is a thermal radiator (hot body)—have different statistics.

We looked at the first- and second-order correlation functions to distinguish the photons from the flash from those originating in the sun. Since the camera controlled the flash, each photon's time of flight could be computed to create the 3-d model. The camera worked well up to a distance of 50 meters.
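The depth-map arithmetic behind the time-of-flight measurement is straightforward. Here is a minimal sketch (the constant and function names are ours, not the camera's):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_tof(round_trip_seconds):
    """Convert a photon's round-trip time of flight into a one-way distance in meters."""
    return C * round_trip_seconds / 2.0

# At the camera's 50 m limit, the round trip takes about 334 ns:
print(2 * 50.0 / C)  # ≈ 3.34e-07 s
```

At these time scales, the 80 ps class timing resolution of SPAD detectors corresponds to roughly a centimeter of depth, which is why such sensors are attractive for ranging.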

I am glad that Dmitri Boiko is still continuing this line of research. With a group at the Fondazione Bruno Kessler (FBK) in Trento, Italy, and a group at the Institute of Applied Physics of the University of Bern, Switzerland, he is working on a new generation of optical microscope systems that exploit the properties of entangled photons to acquire images at a resolution beyond the classical Rayleigh limit.

Read the SPIE Newsroom article Novel CMOS sensors for improved quantum imaging and the open access invited paper SUPERTWIN: towards 100kpixel CMOS quantum image sensors for quantum optics applications in Proc. SPIE 10111, Quantum Sensing and Nano Electronics and Photonics XIV, 101112L (January 27, 2017).

Tuesday, March 18, 2014

Traps in big data analysis

When I was a student, I had chosen mathematical statistics as one of my majors. At the time, the hot topics were robust statistics, non-parametric methods and optimal stopping times. Descriptive statistics was not part of the curriculum (PowerPoint did not yet exist and there was no need for meaningless 3-D pie charts).

In the student houses where I lived, there were always medical students at the end of their studies who had to get a doctorate. Residencies were grueling, and at that time the least-effort thesis was to punch in some historical medical data. On their way home from the clinic, these students would spend part of the night in the empty punch-card rooms, for about 6 months.

Thereafter, they would bring the punch cards to the data center and get 10 to 20 centimeters of SAS printout—and the desperation of not knowing how to get from hundreds of cryptic tables to a one hundred page thesis.

Many of them ended up knocking on my door with the printout, scratching their heads. Because the students could not tell the data center which analyses they needed—after all, there never was an experimental design—the data center staff just ran each and every function available in SAS. Classical garbage-in, garbage-out.

So, I had to tell the students to stare at the data and come up with a few hypotheses, then use the ANOVA routines to confirm them and the regression routines to produce a few nice graphs.
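The core of what those ANOVA routines compute can be sketched with a bare-bones one-way F statistic; this is an illustrative stdlib implementation, not SAS's, and the group data below is made up:

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group variance."""
    all_values = [v for g in groups for v in g]
    grand_mean = mean(all_values)
    k, n = len(groups), len(all_values)
    # Between-group sum of squares, weighted by group size.
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares.
    ss_within = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical outcomes under two historical treatments:
print(one_way_anova_f([[1, 2, 3], [2, 3, 4]]))  # 1.5
```

A large F relative to the F distribution's critical value supports the hypothesis that the group means differ, which is exactly the confirmation step the students were missing.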

Unfortunately, after all these years we are not much better off. Indeed, now we have to deal also with "big data hubris," the often implicit assumption that big data are a substitute for, rather than a supplement to, traditional data collection and analysis. Now we have tools like Google Correlate that allow us to correlate tons of apples with megatons of oranges.

A recent interesting paper by David Lazer et al. is a nice summary of how big data analysis allows us to create more statistical garbage: Lazer D, Kennedy R, King G, Vespignani A. Big data. The parable of Google Flu: traps in big data analysis. Science. 2014 Mar 14;343(6176):1203-5. doi: 10.1126/science.1248506. PubMed PMID: 24626916.

The authors conclude: "Big data offer enormous possibilities for understanding human interactions at a societal scale, with rich spatial and temporal dynamics, and for detecting complex interactions and nonlinearities among variables. We contend that these are the most exciting frontiers in studying human behavior. However, traditional 'small data' often offer information that is not contained (or containable) in big data, and the very factors that have enabled big data are enabling more traditional data collection. The Internet has opened the way for improving standard surveys, experiments, and health reporting. Instead of focusing on a 'big data revolution,' perhaps it is time we were focused on an 'all data revolution,' where we recognize that the critical change in the world has been innovative analytics, using data from all traditional and new sources, and providing a deeper, clearer understanding of our world."


Thursday, November 19, 2009

Mavericks are best for crowd-sourcing

Maverick is the antonym of conformist, or of a culturally competent person. Synonyms include: individualist, nonconformist, free spirit, unorthodox person, original, eccentric; rebel, dissenter, dissident, enfant terrible; informally: cowboy, loose cannon.

When we do a psychophysics experiment the old-fashioned way in a lab, we want informants who are culturally competent persons. In fact, we are very careful to write clear instructions, make sure the informants understand them, and check that they follow the rules. The experimental conditions are strictly controlled so that all informants perform exactly the same experiment.

When we do a psychophysics experiment the new way on the Web using crowd-sourcing, we get all beaten up by our colleagues and our papers keep getting rejected. "You are getting all those disruptive loose cannons out there, your results are meaningless." Well, we could almost reply "consider this formula:"

rx̄y = √N · rxy / √(1 + (N − 1) · rxx)

I have to write "almost" because James Shilts Boster started writing his paper The Value of Cognitive Diversity: The Correlation of Local Aggregates with World Standards on May 6, 2004, but then, as far as I know, never got around to publishing it.

The formula is for the mean correlation of the aggregated responses to a world standard. rxy is the average individual informant's correlation with the world standard, rxx is the average correlation among informants on the similarity judgment task, and N is the number of informants in the pool of aggregated responses.

This formula teaches that when N is small, like in the case of the old fashioned experiment, then we get the best correlation when all informants are culturally competent. Check!

However, when N is large, like in crowd-sourcing, then each new conformist informant does not contribute much to the correlation. Instead, it is the maverick informants, or better, the disagreement among informants that allows their aggregation to closely approximate the world standard. Surprise!
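This behavior can be played with numerically. Below is a sketch assuming the standard composite-correlation form of the formula, with rxy the average individual correlation with the world standard, rxx the average inter-informant correlation, and N the pool size; the parameter values are illustrative:

```python
from math import sqrt

def aggregate_correlation(n, r_xy, r_xx):
    """Correlation of the mean of n informants' responses with the world standard."""
    return sqrt(n) * r_xy / sqrt(1 + (n - 1) * r_xx)

# With one informant, the competent conformist (high r_xy, high r_xx) wins...
print(aggregate_correlation(1, 0.5, 0.9) > aggregate_correlation(1, 0.4, 0.2))    # True
# ...but with a large pool, the diverse mavericks win: low inter-informant
# agreement (r_xx) barely dampens the benefit of aggregation.
print(aggregate_correlation(100, 0.4, 0.2) > aggregate_correlation(100, 0.5, 0.9))  # True
```

As N grows, the aggregate correlation approaches rxy / √rxx, so disagreement among informants (small rxx) is what lets the aggregate track the world standard.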

With this, we call on all rebels, dissenters, and mavericks out there and beg them to contribute to our color naming experiment at the link for their language on this page: http://www.hpl.hp.com/personal/Nathan_Moroney/mlcn.html

Monday, March 30, 2009

Blue for research, red for development

Color scientists are notorious for surrounding themselves with gray in order not to pollute their retinas with after-images. Typically, the desktop background on their PC is gray so they can make unbiased color judgments.

Color scientists do not spend their days looking at pretty color images. Most of their time is spent writing software implementing color rendering algorithms; sometimes they even do some research to come up with new algorithms. So there is no imperative need for a gray desktop background.

Recent research by Ravi Mehta and Rui (Juliet) Zhu at the Sauder School of Business, University of British Columbia suggests color scientists should change their PC's desktop to blue when they are conducting research and to red when programming.

From a series of six studies reported in Science 27 February 2009: Vol. 323. no. 5918, pp. 1226 - 1229, they conclude that although people have an overall preference for blue versus red color, red can be beneficial when the focal task requires detailed attention.

[Figure: the left version of the advert features visuals that are only remotely related to the focal camera product; the right version features visuals that depict specific product details.]

Red is often associated with dangers and mistakes. They report that claims have been made linking the color red to the highest level of hazard and also the highest level of compliance. In contrast, blue is often associated with openness, peace, and tranquility (e.g., ocean and sky). A word association test confirmed that people indeed generate these different associations to red versus blue in the cognitive task domain.

[Note that the red and blue may have different associations across cultures. Just replace the color names appropriately.]

Mehta and Zhu propose that these different associations with red versus blue can induce alternative motivations. Specifically, red, because of its association with dangers and mistakes, should activate an avoidance motivation, which has been shown to make people more vigilant and risk-averse. Thus, red, compared with blue, should enhance performance on detail-oriented tasks.

In contrast, because blue is usually associated with openness, peace, and tranquility, it is likely to activate an approach motivation, because these associations signal a benign environment that encourages people to use innovative as opposed to "tried-and-true" problem-solving strategies. Indeed, an approach motivation has been shown to make people behave in a more explorative, risky manner. Thus, blue versus red should enhance performance on creative tasks.

Indeed, their study shows that red (versus blue) can activate an avoidance (versus approach) motivation and subsequently can enhance performance on detail-oriented (versus creative) cognitive tasks. When the task on hand requires people's vigilant attention (e.g., programming), then red might be particularly appropriate. However, if the task calls for creativity and imagination (e.g., a new product idea brainstorming session), then blue would be more beneficial.

Thursday, January 8, 2009

A quantum imager for intensity correlated photons

Yesterday our paper A quantum imager for intensity correlated photons was published in the New Journal of Physics. NJP is published by the Deutsche Physikalische Gesellschaft and the Institute of Physics. The link to the paper is http://www.iop.org/EJ/abstract/1367-2630/11/1/013001, where you find this abstract:

We report on a device capable of imaging second-order spatio-temporal correlations g(2)(x, τ) between photons. The imager is based on a monolithic array of single-photon avalanche diodes (SPADs) implemented in CMOS technology and a simple algorithm to treat multiphoton time-of-arrival distributions from different SPAD pairs. It is capable of 80 ps temporal resolution with fluxes as low as 10 photons s−1 at room temperature. An important application might be the local imaging of g(2) as a means of confirming the presence of true Bose–Einstein macroscopic coherence (BEC) of cavity exciton polaritons.

Tuesday, September 30, 2008

Experiments supporting the concept of a g(2)-camera

Last weekend, like an astronaut in a Mercury capsule, I sat strapped in a small seat in a metal tube being flung across the Atlantic and Canada's Northwest Territories, reading the day's press from both sides of the Atlantic to catch up with the last two weeks of news and get an appreciation of the reality field's distortions.

On both sides of the Atlantic, physicists made front-page news, but for very different reasons, as you would expect in a Riemannian reality field. In the US newspaper, a journalist had been chasing so-called financial geniuses in New York and London to get the story on $700 billion of toxic financial paper. In the European newspapers the story was on page four, with the question of why the US Government was talking about $700 billion when the actual amount of toxic paper was $3,500 billion, i.e., $3,500,000,000,000.00.

Anyway, that is what you get with reality distortion, but it was not what caught my attention. The journalists were not able to get any financial genius to speak on the record, so they reported remarks from both sides of the Atlantic stating that the financial instruments were so complex that there was no way they (the geniuses) could understand them, that is why they hired quantum mechanics physicists to cook up risk models.

So, there it was, written black on white: the quantum mechanics physicists are to blame for the $3,500 billion of toxic papers. Hmm, and I thought the only toxic paper quantum physicists handle is that in the litter box of Schrödinger's cat. And they cannot even know whether the cat is dead or alive.

The story about the quantum physicists would have been more believable if they had written that the $3,500 billion disappeared in a black hole when the Large Hadron Collider (LHC) was turned on in Geneva (see this article on page 1291 of Science magazine of 5 September 2008).

Science 5 September 2008: Vol. 321. no. 5894, p. 1291

That is what I read in the US newspapers. In the European newspapers, physicists made the front page for completely different reasons. The first reason was the LHC. There had been some apprehension about black holes, but the start of operation on 10 September was a full success. Unfortunately, over a week later, a possibly faulty electrical connection between two of the accelerator's magnets caused a large helium leak into sector 3-4, moving the start of the experiments to March 2009.

What the newspapers explained in some detail was how beneficial the $8 billion spent on the LHC was for European industry, because it spurred a large amount of new technology in fields like superconductors and low-temperature materials. While I was reading this, I thought, wow, $8 billion ≪ $3,500 billion. We could have had our own supercollider in Texas for only the bonuses of one bank in one year!?

The second front page news related to physics in European newspapers was Zhai Zhigang's space walk and the impact the development of the Shenzhou 7 capsule and its launching technology had on Chinese industry, leading it to develop more advanced technologies.

As a whole, from a European perspective, quantum physics and rocket science are not as bad as they are believed to be on this side of the Atlantic. From an international point of view, that had already been decided in the Nuremberg trials, which lets me continue with the meat of this post without shame.

It did not make the newspapers, but last week our paper on experiments supporting the concept of a g(2)-camera was published. If your institution does not subscribe to SPIE's Digital Library, you can buy it for only $18.00 (those are plain dollars, not billions).

Recent experiments have reported the Bose-Einstein condensation (BEC) phase transition for exciton-polariton systems in a semiconductor microcavity. The macroscopic quantum degeneracy is typically detected by probing the statistical properties of light emitted from a microcavity, under the presumption that the statistics of the exciton polaritons are faithfully transferred to the emanating photons.

The macroscopic quantum degeneracy can be established by measuring the correlations, viz., first-order in the electric fields:

g(1)(x12, τ) = ⟨E*(x1, t) E(x2, t + τ)⟩ / √(⟨|E(x1, t)|²⟩ ⟨|E(x2, t)|²⟩)

and second-order in the electric fields:

g(2)(x12, τ) = ⟨I1(t) I2(t + τ)⟩ / (⟨I1(t)⟩ ⟨I2(t)⟩)

Moreover, it has been assumed that observation of interference fringes similar to those in Michelson or Young interferometers is sufficient to establish the fact of macroscopic coherence in exciton-polariton systems. Two points on the wave front separated by a distance x12 produce an intensity pattern

I(x) = I1 + I2 + 2 √(I1 I2) |g(1)(x12, τ)| cos φ(x)

such that the fringe visibility measures the magnitude of the first-order correlation function g(1)(x12, τ). But measuring this quantity alone is ambiguous, because a coherent light source (e.g., a photon laser or a decaying polariton BEC) can exhibit the same first-order correlations as a chaotic (or thermal) light source (e.g., an Hg-Ar discharge lamp). The table below shows that proper disambiguation of a coherent state also requires measurement of the second-order correlation function

g(2)(x12, τ) = ⟨I1(t) I2(t + τ)⟩ / (⟨I1(t)⟩ ⟨I2(t)⟩)

associated with intensity noise correlations. Here, I1,2(t) is the light intensity at a point ±½ x12 and time t.
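The way g(2)(0) separates coherent from chaotic light can be illustrated with a minimal numerical sketch (our own illustration, not the camera's algorithm): for chaotic (thermal) light the instantaneous intensity is exponentially distributed, which gives ⟨I²⟩/⟨I⟩² = 2, while an ideal single-mode coherent source has constant intensity, which gives 1.

```python
import random
from statistics import mean

def g2_zero(intensities):
    """Estimate g(2)(0) = <I(t) I(t)> / <I(t)>^2 from a sampled intensity trace."""
    m = mean(intensities)
    return mean(i * i for i in intensities) / (m * m)

random.seed(42)
# Chaotic (thermal) light: exponentially distributed instantaneous intensity.
chaotic = [random.expovariate(1.0) for _ in range(200_000)]
# Ideal coherent light (single-mode laser): constant intensity.
coherent = [1.0] * 1000

print(round(g2_zero(chaotic), 1), g2_zero(coherent))  # ≈ 2.0 and exactly 1.0
```

This is the photon-bunching signature of the table below: chaotic light shows the excess ∆g(2) = 1, while a coherent state shows none.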

Maximal values of the respective correlation functions for incoherent, coherent, and chaotic (thermal) light states:

correlation function | incoherent | coherent | chaotic
g(1)(x, 0)           |     0      |    1     |    1
g(2)(x, 0)           |     1      |    1     |    2
∆g(2)(x, 0)          |     0      |    0     |    1

The minimal condition to confirm the BEC phase transition in a polariton system then becomes

g(1)(x, 0) = 1 together with g(2)(x, 0) = 1, i.e., ∆g(2)(x, 0) = 0

Our imager detects the spatial correlation excess shown as ∆g(2) ≡ g(2)(x, 0) – 1 in the third row of the table above.

In our paper, we present a novel g(2)-imager built with conventional CMOS technology, which is capable of measuring second-order spatio-temporal correlated photons and thereby offers an important means for verifying the existence of a BEC state of cavity exciton polaritons.

Exploded micrograph of the 4x4 SPAD array

One potential limitation when imaging BECs with our device is the requirement that ∆g(2) = 0, which corresponds to a null measurement. For BEC detection, however, we anticipate that a more practical device could combine conventional g(1)-imaging with g(2)-imaging, either as the same camera operated in two distinct modes or as two distinct cameras working together.

Future work will include the development of larger arrays of SPADs, the integration of on-chip data processing based on the g(2) equation above, and the extension to other g(2)-imaging applications.

A surprising feature of the g(2)-camera is that the parallelism of the sensor stemming from using N detectors does not scale linearly but binomially. For example, with a 4 × 4 SPAD array all 16 detectors have separate parallel outputs, so that C(16, 2) = (16 × 15)/2 = 120 simultaneous pairwise measurements are possible.
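This binomial scaling of the pairwise channels can be sketched directly (the detector indexing is illustrative):

```python
from itertools import combinations
from math import comb

n_detectors = 16  # a 4 x 4 SPAD array
# Each unordered pair of detectors yields one simultaneous g(2) measurement channel.
pairs = list(combinations(range(n_detectors), 2))
print(len(pairs), comb(n_detectors, 2))  # 120 120
```

Doubling the array side to 8 × 8 (64 detectors) would already give 2016 pairwise channels, which is why on-chip processing becomes attractive for larger arrays.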

You can get the full paper from this link: http://spie.org/x648.xml?product_id=795166.