3D images: acquisition, synthesis and visualisation
The IN'Tech technology watch club is hosting a day of forums between researchers and industrial operators on the topic of "3D images: acquisition, synthesis and visualisation", on Thursday, 14 October 2010, from 8:30 a.m. to 5:30 p.m., at the Inria centre in Montbonnot.
- Date: 14/10/2010
- Place: Inria Grenoble - Rhône-Alpes, Montbonnot
- Organiser(s): Inria Grenoble - Rhône-Alpes and Association GRILOG
Welcome: 8:30 a.m. - 9:00 a.m.
Introduction to the day
By P.J. Crepin, ORA Cluster, and Philippe Broun, Inria
9:20 a.m. - Three-dimensional visual perception in humans
Pascal Mamassian, Psychology of perception lab, Paris
The perception of space and three-dimensional (3D) shapes in humans is based on the use of multiple indices present in retinal images. The binocular disparity index is undoubtedly the best known and one of the most reliable, but other indices pertaining to movement or shadows complement this binocular index, especially for greater depths. This talk will present some fundamental properties of 3D perception from various indices as well as recent results on the binocular index and interactions between these indices.
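The binocular-disparity cue described above follows a simple geometric relation: for a fronto-parallel pinhole camera pair, depth is inversely proportional to disparity, which is one reason the cue loses reliability at greater depths, where other cues must complement it. A minimal sketch of this relation (the focal length, baseline and disparity values are illustrative, not from the talk):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation Z = f * B / d: depth in metres from the
    focal length (pixels), camera baseline (metres) and disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity has no finite depth")
    return focal_px * baseline_m / disparity_px

# Disparity shrinks quickly with depth, so a fixed measurement error
# translates into a much larger depth error for distant objects:
near = depth_from_disparity(1000.0, 0.065, 13.0)  # 5.0 m
far = depth_from_disparity(1000.0, 0.065, 6.5)    # 10.0 m
```

Halving the disparity doubles the estimated depth, so at large distances a one-pixel matching error can shift the estimate by metres, consistent with binocular disparity being most reliable at near range.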
10:00 a.m. - 3DLive project: shooting for 3D images
Patrick Defay, Thales Angénieux
3DLive is a collaborative project bringing together industrial (Orange Labs, Thales Angénieux, Technicolor, Binocle, AMP, Grass Valley) and academic (Inria, Institut Telecom) partners, and aims to master the recording, processing and transmission of live cultural or sporting events in stereoscopic 3D. Each partner brings recognised expertise covering one part of this chain and contributes operationally or through research and development.
The presentation will address the recording aspect, for which cameras, lenses, camera rigs, and real-time 3D staging and correction software must be specially developed to transmit an event in stereoscopic 3D.
Break: 10:30 a.m. - 10:50 a.m.
10:50 a.m. - Stereoscopic video correction and adaptation to the display device
Frédéric Devernay, Inria
A stereoscopic film shot with an ordinary pair of cameras generally presents geometric imperfections that can cause visual fatigue in viewers. We will examine various factors of visual fatigue and propose solutions for correcting the videos digitally in order to make them more comfortable to watch. Analysis of the geometry of the filming system and the display device shows that a stereoscopic video is necessarily produced for a given display device (for example, a 3D television screen); showing the same film on another device (for example, a cinema screen) risks producing significant geometric distortions and causing visual fatigue. We therefore also propose processing steps that adapt the stereoscopic video to the display device, avoiding visual fatigue while preserving the proportions of the scene's depth.
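The device-dependence argument can be made concrete: a disparity stored as a fraction of image width becomes a physical on-screen separation proportional to screen width, and a positive separation wider than the interocular distance forces the eyes to diverge. A rough sketch of this check (the 65 mm interocular distance and the example screen widths are illustrative assumptions, not values from the talk):

```python
EYE_SEPARATION_M = 0.065  # typical adult interocular distance (assumption)

def on_screen_disparity_m(disparity_frac: float, screen_width_m: float) -> float:
    """Physical disparity on the screen, for a disparity expressed as a
    fraction of image width. The same video shown on a wider screen
    produces a proportionally larger physical disparity."""
    return disparity_frac * screen_width_m

def forces_divergence(disparity_frac: float, screen_width_m: float) -> bool:
    """True if a positive disparity exceeds the interocular distance,
    i.e. the eyes would have to diverge -- a known fatigue factor."""
    return on_screen_disparity_m(disparity_frac, screen_width_m) > EYE_SEPARATION_M

# A 1% disparity is comfortable on a 0.9 m television (9 mm on screen)
# but the identical video on a 10 m cinema screen yields 100 mm:
tv_ok = not forces_divergence(0.01, 0.9)
cinema_ok = not forces_divergence(0.01, 10.0)
```

This is why content mastered for one device must have its disparities remapped, not merely rescaled, for another.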
11:30 a.m. - Moov'3D project: A miniaturised system for 3D applications
Anne Guerin, GIPSA Lab, Grenoble
The "Moov'3D" project is a three-year project of the "Minalogic" competitiveness cluster, grouping together industrial partners – ST-Ericsson (project sponsor), MicroOLED, Visioglobe and Pointcube – and academic partners – CEA-LETI-Lab ITUS, CNRS-GIPSA-lab. The project addresses the introduction of stereoscopic 3D on mobile phones and intends to remove the related technological barriers in order to provide application developers with a development platform. The project thus strives to create a platform with all of the hardware and software functions required to record, process and view 3D on telephones and other mobile equipment. This platform will be used to develop applications using stereoscopic 3D efficiently, such as recording and viewing stereoscopic photo/video content, and virtual- and augmented-reality applications. The objective is for this platform to become a benchmark through its processing power (up to high definition), its connectivity to the various existing 3D viewing systems (living-room screen or stereoscopic glasses for mobile applications), and its image quality, achieved by taking into account, as far as possible, the characteristics of human visual perception.
12:00 p.m. - 3D vision, stereoscopy and immersive human-computer interfaces
Alexis Nédélec, ENIB
After an introduction on the perceptive factors of 3D reconstruction by the visual system, this presentation will set out the principles of stereoscopic vision, their implementation in stereoscopic display systems, the fields of application for stereovision, as well as a taxonomy of existing 3D display systems.
Lunch: 12:40 p.m. - 2:15 p.m. - Explore the stands and poster session
- 4D View Solutions (Richard Broadbridge, co-founder and chairman of 4D Views - www.4dviews.com): 4D Views, innovation in 3D scene digitisation. 4D Views provides multi-camera video acquisition solutions that digitise a real, dynamic scene in three dimensions in real time. The 3D scans produced by this solution contain both the colour and the geometry of the digitised scene, a representation that permits real-time observation of the scene from viewing angles other than those of the cameras filming it. 4D Views sells its solutions to public and private research organisations abroad and offers 3D capture services to the film industry.
- Visioglobe (Mr Cédric Manzoni, Mr Hector Briceño): VisioStreet, 3D interactive city navigation on mobile phones. In a world where smartphones are becoming omnipresent and users need to remain connected, we offer VisioStreet®, an application that allows the user to navigate within a city in 3D.
The application can be used by local city residents, tourists or business people, to find friends, restaurants, and much more.
The technology can be used in both outdoor and indoor settings, and real-time advertising and social networks are directly integrated into the visual data.
- Binocle (Mr Sergi Pujades): Stereoscopic camera: 3D rig. The presentation will focus on Binocle's filming tools: the Brigger III (3D rig) and the DisparityTagger. The first is a versatile motorised unit that precisely and effectively controls the relative position of the cameras; this position determines the 3D effect. The second controls and corrects stereoscopic streams in real time. The DisparityTagger incorporates algorithms resulting from a technology transfer from Inria Grenoble - Rhône-Alpes.
These tools are currently in use on film shoots: in cinema, for example on the French feature film "Behind the Walls", and in broadcast, for example on the Six Nations rugby matches.
Binocle develops and offers cinema professionals the most innovative production and post-production tools dedicated to 3D staging. A production company specialising in the shooting and processing of stereoscopic images, Binocle 3D relies on strong expertise in cinematographic shooting and digital technologies.
- Thales Angénieux (Mr Patrick Defay): Application of 3D observation to the field of infrared imaging. Thales Angénieux is an internationally recognised manufacturer of high-tech optical and optronic systems, specialising in the design and manufacture of zoom lenses for cinema and television and for surveillance and security, as well as in night-vision goggles. Today, Thales Angénieux is the European leader in night vision, mastering light-intensification and infrared technologies. Its capacity for innovation now makes Thales Angénieux a major player in the development of 3D.
A waveguide for 3D
Adrian Travis, Microsoft Corporation
Estimation of depth through sharpness analysis
Jérôme Dias(1), Gergely Papp(1), Stéphane Gétin(1), David Alleysson(2); (1) CEA, LETI, MINATEC (2) LPNC, Université de Grenoble
Exchanges and work groups
Closing of the seminar
Summary of the day in plenary
Conclusion: ORA cluster and Grilog