Current knowledge

Everything started with the famous 1935 paper of Einstein, Podolsky, and Rosen [Einstein1935]. They asked whether quantum mechanics could be considered complete. While investigating this problem they discovered that pairs of quantum objects may exhibit perfect correlations in complementary observables (in the original EPR case, both in positions and in momenta). They erroneously conjectured that such quantum correlations suggest the existence of additional variables - (local) elements of reality (a form of hidden variables). The error was exposed by Bell [Bell1964], who demonstrated the impossibility of describing quantum correlations (due to entanglement) with local hidden variables of any type. The 1980s and 1990s brought a continuing growth of interest in Bell inequalities, and the first experimental attempts to falsify local realism while closing the main experimental loopholes [Aspect1982]. Exactly a quarter of a century after Bell's result we learned that the conflict between local realism and quantum mechanics becomes exponentially stronger with the number of subsystems under consideration - the GHZ theorem [Greenberger1989]. Moreover, with the pioneering works of Bennett [Bennett1993], who demonstrated the possibility of teleporting an unknown quantum state, and of Ekert [Ekert1991], who associated the secure quantum extraction of a cryptographic key with Bell inequalities, entanglement, and in particular its potential to falsify local realism, emerged as a resource for more efficient processing of information [Scarani2001, Brukner2004, Sen2004]. The enhancement is a consequence of correlations generated by entangled states, which cannot be mimicked by local actions and classical communication. In a parallel development, quantum information processing was shown to enable the efficient performance of certain computational tasks, provided one is able to construct a quantum computer [Deutsh1992, Grover1994, Shor1996].

Even the simplest generalizations of the original Bell inequalities lead to a stronger contrast between local realism and quantum mechanics. Hence in recent years many groups have focused on systematically deriving new forms of the Bell theorem. There are three main directions of this development: finding inequalities for more particles [Mermin1990, Ardehali1992, Belinsky1993, Werner2001, Weinfurter2001, Zukowski2002], for more measurement settings [Pitowski2001, Sliwa2003, Laskowski2004, Zukowski2006], or for higher dimensionality of the subsystems [Kaszlikowski2000, Collins2002, Zohren2008]. These results have shown that there are many new, intriguing effects concerning the non-classicality of entangled states. We cannot claim, however, that the structure of the set of entangled states and Bell inequalities is fully understood. For example, we want to further explore the properties of Bell inequalities involving lower-order correlations, which may reveal new non-classical phenomena.
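
As a standard illustration of the contrast meant here (a textbook example recalled only for orientation, not a result of this project), the two-qubit CHSH inequality bounds local realistic correlation functions by

\[
|E(a_1,b_1)+E(a_1,b_2)+E(a_2,b_1)-E(a_2,b_2)| \le 2,
\]

whereas quantum mechanics allows values up to $2\sqrt{2}$; for $N$ qubits, Mermin-type inequalities of a similar form are violated by GHZ states by a factor growing roughly as $2^{(N-1)/2}$, which quantifies the exponential strengthening mentioned above.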

Additionally, this structure is linked with fundamental relations between correlations for various measurements. These limitations may prevent a simultaneous violation of two different Bell inequalities for two different, but overlapping, sets of elementary subsystems [Toner2006]. Recently we have developed a new tool for studying these complementarities [Kurzynski2010]. It is certainly interesting to continue the research in this direction, in order to establish communication and cryptography protocols with different security levels. Modern photonic experiments in quantum information processing most often rely on parametric down-conversion; for a review see [Pan2008]. It is a non-linear optical process generating pairs of photons strongly correlated in frequencies, polarizations, etc. However, if we want to use more such pairs simultaneously, and to "construct" more complicated multi-photon entangled states, we face fundamental limitations in precisely tailoring the profiles of photons originating from different pairs. An analysis of such effects was presented e.g. in [Laskowski2009]. A full understanding of these effects, and learning to control them, is key to the success of future realizations of multi-photon schemes. This motivates our interest in studying more advanced techniques of noise control in parametric down-conversion, and in other developments of the sources (cavity enhancement, optical fiber sources, periodically poled crystals, four-wave-mixing-based techniques).
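
For concreteness (a standard, idealized description; the exact state depends on the source and on spectral filtering), a single type-II down-conversion event produces the polarization-entangled pair

\[
|\psi\rangle = \frac{1}{\sqrt{2}}\left(|H\rangle_a|V\rangle_b + e^{i\phi}|V\rangle_a|H\rangle_b\right),
\]

while the full multi-mode output also contains higher-order (multi-pair) emissions, and the spectral profiles of photons from different pairs are in general distinct; it is precisely the imperfect overlap of these profiles that limits the quality of multi-photon interference.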

An important step in the application of any quantum technology is testing whether a state produced by a source is suitable for the given protocol [Horodecki2009]. The basic question we are most interested in is whether, and in what way, the state is entangled. A convenient method to provide such an answer is to use entanglement witnesses (EWs) [Terhal2000]. Originally, EWs were understood as linear operators with non-negative mean values for all separable states, but negative mean values for some entangled states. It was soon realized that two trends in research on EWs are important. One is to construct more economical entanglement criteria, which allow one to detect non-classical resources at a minimal cost to the observers, in terms of the necessary measurements [Lewenstein2000]. This is in contrast to quantum state tomography, which requires processing a large amount of experimental data. The other trend is to optimize EWs so that they can detect as many states as possible. It is also possible to construct witnesses tailored to detect specific states. In this respect, non-linear criteria of entanglement ([Guehne2006], and a completely different approach in [Badziag2008]) turn out to be more powerful than linear witnesses. Hence we are interested in developing new entanglement tests, which will be simpler to apply or more universal (a standard two-qubit example of a witness is recalled below).

The recent quantum information boom has not only been related to the processing of quantum information per se, as in the case of quantum computing and quantum cryptography, but has also affected other areas of physics, providing a new methodology. One prominent example is the merging of quantum information methods with many-body theory, resulting in new tools for simulating the properties of many-body systems. Another emerging field is the so-called "quantum thermodynamics", which appeared earlier, independently of quantum information, see e.g. [Pusz1978, Lenard1978, Alicki1978, Goldstein2001] (Robert Alicki from our group was one of the pioneers of the modern approach to thermodynamics), but which is now at the centre of interest of many quantum information groups [Popescu2009, delRio2011].
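
To recall the standard two-qubit illustration of the witness criterion described above (a textbook construction, quoted here only for orientation), one may take

\[
W = \tfrac{1}{2}\,\mathbb{1}\otimes\mathbb{1} - |\phi^+\rangle\langle\phi^+|, \qquad |\phi^+\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right),
\]

for which $\mathrm{Tr}(W\rho)\ge 0$ for every separable state $\rho$ (the squared overlap of $|\phi^+\rangle$ with any product state is at most $1/2$), while $\mathrm{Tr}\!\left(W|\phi^+\rangle\langle\phi^+|\right) = -1/2$.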

Although thermodynamics is an old and well-established domain, there is still a lot of confusion in the literature regarding the precise notion of work, including some false alarms concerning violations of the second law in the strong-coupling regime (see e.g. the references in one of the papers refuting such possible violations of the second law [Hilt2011]). To recall one possible source of confusion: the work that can be drawn from a system is defined as the maximal possible energy decrease that can be obtained by applying a unitary transformation. This definition does not explicitly explain how to actually extract this work, and in the case of micro-systems it is by no means clear whether such an amount of work can in fact be obtained. As a matter of fact, a refined definition of work is needed: to recall a recent attempt to approach this question, in [Linden2010] a model for micro-engines was proposed, where the proposed definition of work is by no means convincing. To summarize, while the quest of probing the efficiency of thermal micro-machines is now extensively investigated both in theory and in experiment [Erez2010], the results are overshadowed by the lack of a precise picture of thermodynamical notions in the micro-regime.

A possible approach that could contribute to clarifying the picture, and which has not yet been explored, was pioneered in [Janzing2000], where thermodynamics was presented in a way analogous to resource theories from quantum information theory, where the main question is: "can a given state be transformed into another state by a given class of operations?". Quite recently, a member of our group, Michal Horodecki, in collaboration with Jonathan Oppenheim (Cambridge University, moving to University College London) and other authors [Horodecki2011], has merged the concept of [Janzing2000] with knowledge from quantum information processing, in particular with perhaps the simplest possible resource theory, proposed in [Horodecki2003]. In that paper, the maximally mixed state was considered a free resource, and unitary operations the allowed operations. It was shown that, in the asymptotic regime, one state can be transformed into another if and only if the latter has a smaller relative entropy with respect to the maximally mixed state than the former. In [Horodecki2011] we were able to generalize this to a thermodynamical setup, where the free resource is a heat bath in a Gibbs state, and the operations are unitary maps commuting with the total Hamiltonian. In the limit of many identical copies, we have obtained an inter-conversion law governed by the difference of free energy between the given state and the heat bath, thus reproducing the thermodynamical laws.
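
For orientation, and in standard notation (the symbols below are ours and serve only to make the above statements concrete), the work extractable from a state $\rho$ with Hamiltonian $H$ by a unitary is

\[
W_{\max}(\rho) = \mathrm{Tr}(\rho H) - \min_{U}\,\mathrm{Tr}\!\left(U\rho U^{\dagger} H\right),
\]

while the quantities governing the asymptotic inter-conversion results recalled above are the relative entropy to the maximally mixed state, $D(\rho\,\|\,\mathbb{1}/d) = \log d - S(\rho)$, and the free energy $F(\rho) = \mathrm{Tr}(\rho H) - k_{B}T\,S(\rho)$, which for a bath at temperature $T$ satisfies $k_{B}T\,D(\rho\,\|\,\rho_{\beta}) = F(\rho) - F(\rho_{\beta})$, with $\rho_{\beta} = e^{-H/k_{B}T}/Z$ the Gibbs state.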

The current knowledge concerning experimental multiphoton effects with possible technological applications is reviewed in depth in [Pan2008], which is delivered as a part of the project application (as one of the three papers of the applicant).