Use of coherent detection for 40-Gb/s and 100-Gb/s optical transmission systems

Ever since the development of the internet and the introduction of the world wide web (www), the demand for data capacity has increased exponentially. Current predictions indeed foresee long-term traffic growth of about 50% per year [1], driven by content-heavy applications such as YouTube, Skype, Google Earth, Facebook and video-on-demand services. Data transmission over optical fibre has revolutionised the telecommunication landscape and has played a major role in the advent of the information age.

Telecommunication providers employ fibre-optic communication systems for most backbone, regional and metropolitan transmission links in order to respond to this need for capacity. Fibre-optic communication systems have not stopped evolving since the first commercial demonstration of live telephone traffic over optical fibre on 22 April 1977, carried out by the General Telephone and Electronics company. Several key enablers have progressively helped overcome successive bottlenecks: the introduction of single-mode fibres, the invention of the Erbium-doped fibre amplifier (EDFA) allowing the efficient deployment of wavelength division multiplexing (WDM), the use of forward error correction (FEC), and the arrival of chromatic dispersion management techniques. Typically, the transmission distances in metropolitan, regional and backbone networks are less than 300 km, between 300 and 1,000 km, and more than 1,000 km, respectively. This thesis focuses on the long-haul fibre-optic systems used in backbone networks. Today's standard long-haul terrestrial systems employ single-mode fibres, WDM with 50-GHz channel spacing, optical amplification, dispersion management and forward error correction. Due to the limited bandwidth of current optical amplifiers, these systems typically carry up to 80 or 100 WDM channels.

Until recently, simple intensity modulation of light combined with direct detection has been the choice for commercial optical transmission systems. This option combines cost-effective transmitter and receiver structures with sufficient transmission performance, enabling the realisation of 10-Gb/s long-haul transmission systems. Nevertheless, to keep responding to the ceaselessly increasing demand for capacity, higher bit rates per channel are required, namely 40 Gb/s and 100 Gb/s. The challenge is that fibre-optic transmission generally becomes more and more difficult as the bit rate increases, since the requirements on the end-of-link optical signal-to-noise ratio (OSNR) increase whereas the robustness against linear transmission impairments, namely chromatic dispersion and polarisation mode dispersion (PMD), decreases drastically. In this context, phase shift keying (PSK) modulation formats using (direct) differential detection have been widely studied over the last decade. However, the tolerance to linear impairments remains one of the major concerns for their field deployment. Therefore, a disruptive solution allowing the capacity increase while being robust against linear impairments is required.

Sometimes, disruptive solutions do not come from truly new discoveries but from the innovative combination of available technologies. Optical coherent detection was first proposed in the 1980s to improve receiver sensitivity. At that time, no optical pre-amplification was used in front of the receiver, and detection was mostly limited by the thermal noise of the photodiodes and electrical amplifiers. Hence, a local oscillator much more powerful than the signal allowed for coherent amplification. However, the development of the EDFA as an optical pre-amplifier led to a decline of interest in coherent detection. It was not until the recent advances in digital signal processing and high-speed electronics, especially in analog-to-digital converters, that coherent detection emerged as one of the most powerful tools to achieve high-speed optical transmission. By providing access to the amplitude, the phase and the polarisation of the optical field, coherent receivers enjoy the excellent sensitivity to optical noise associated with homodyne detection, with the added benefits of advanced digital signal processing. Thus, they offer the possibility to operate with unlocked local oscillators and to compensate for linear impairments, making high-bit-rate transmission possible. Moreover, digital coherent receivers allow the detection of polarisation division multiplexed (PDM) signals without any extra component at the receiver side. In comparison with singly-polarised signals, PDM can double the spectral efficiency, i.e. the number of bits transmitted per second per Hertz, as it modulates independent data onto each of the two orthogonal polarisations of the optical field. Besides, PDM halves the symbol rate with respect to singly-polarised signals for a given bit rate.
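As a worked example of how PDM halves the symbol rate, the symbol rate follows from dividing the bit rate by the number of polarisation tributaries and the bits carried per symbol. The sketch below uses PDM-QPSK (2 bits per symbol on each polarisation) purely as an illustrative format, not one prescribed at this point in the text:

```python
def symbol_rate_gbaud(bit_rate_gbps, bits_per_symbol, n_polarisations=2):
    """Symbol rate of a PDM signal: the bit rate is shared between the
    polarisation tributaries and the bits carried per symbol."""
    return bit_rate_gbps / (bits_per_symbol * n_polarisations)

# PDM-QPSK carries 2 bits/symbol on each of the two polarisations:
print(symbol_rate_gbaud(40, 2))    # 40 Gb/s  -> 10.0 GBd
print(symbol_rate_gbaud(100, 2))   # 100 Gb/s -> 25.0 GBd
```

The halved symbol rate is what relaxes the bandwidth requirements on the transmitter and receiver electronics for a given bit rate.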

On top of linear fibre effects and optical noise, optical communication systems may be limited by nonlinear fibre effects stemming mainly from the Kerr effect. The penalties caused by nonlinearities may depend on channel bit rates, modulation formats and detection techniques. Hence, the behaviour against nonlinearities of polarisation-multiplexed signals detected with coherent receivers may differ from that of current (singly-polarised) intensity-modulated direct-detection schemes, for example because the two modulated polarisation tributaries may interact nonlinearly along the transmission. Besides, most carriers do not intend to build specific networks from scratch for the new data rates, and they require any future upgrade to higher bit rates to be compatible with their (legacy) 10-Gb/s networks. Their motivation is to smoothly respond to the predicted traffic growth of about 50% per year mentioned above. Therefore, one likely scenario for upgrading 10-Gb/s infrastructures is to progressively insert higher-bit-rate channels among 10-Gb/s ones and to propagate them over links designed for 10-Gb/s operation with 50-GHz channel spacing. Hence, the key parameters determining the suitability of a solution for overlaying existing 10-Gb/s legacy networks will be its tolerance to linear distortions (chromatic dispersion, PMD and narrow filtering) and its correct cohabitation with co-propagating 10-Gb/s non-return-to-zero on-off keying (NRZ-OOK) channels.

In the telecommunications landscape, optical fibres are massively used as a physical medium to transfer information from one point to another by means of an optical signal. Optical fibres are cylindrical nonlinear dielectric waveguides commonly made of silica glass. Since Kao and Hockham proposed their use for telecommunications at optical frequencies in 1966 [2], they have continued to evolve [3]. The most common structure consists of three concentric cylindrical sections: the core, the cladding and the coating. The light is confined within the core through total internal reflection thanks to the difference between the refractive indices of the core and the cladding [4]. The core refractive index of silica fibres is n1 ≈ 1.48, whereas the refractive index of the cladding, n2, is between 0.2% and 3% lower. This difference between the refractive indices can be achieved either by increasing the refractive index of the silica within the core, through germanium-dioxide (GeO2) doping, or by reducing the refractive index of the cladding, through fluorine (F) doping.
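The guiding properties implied by these indices can be quantified with the numerical aperture and the critical angle for total internal reflection. A minimal sketch, assuming a cladding index 0.3% below the quoted core index of 1.48 (an arbitrary choice within the stated 0.2-3% range):

```python
import math

def numerical_aperture(n_core, n_clad):
    """NA = sqrt(n1^2 - n2^2): sine of the half-angle of the
    acceptance cone of light launched into the fibre."""
    return math.sqrt(n_core**2 - n_clad**2)

def critical_angle_deg(n_core, n_clad):
    """Incidence angle (measured from the normal) beyond which light
    undergoes total internal reflection at the core/cladding interface."""
    return math.degrees(math.asin(n_clad / n_core))

n1 = 1.48                 # core index quoted in the text
n2 = n1 * (1 - 0.003)     # assumed cladding index, 0.3% lower
print(numerical_aperture(n1, n2), critical_angle_deg(n1, n2))
```

Even this small index contrast gives a critical angle above 85 degrees, so only rays travelling nearly parallel to the fibre axis are guided, which is precisely what keeps the light confined in the core.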

Depending on the number of propagation modes, one speaks of single-mode (one mode) or multi-mode (several modes) fibres. The number of modes supported by an optical fibre at a given wavelength depends on its design parameters, namely it scales with the core radius and the refractive-index difference between the core and the cladding. Single-mode fibres are the most widely used today, thus avoiding the detrimental pulse broadening induced by intermodal dispersion. The fibre most commonly employed for telecommunications is standard single-mode fibre (SSMF). Its core radius is 4.5 µm, which ensures single-mode transmission in the telecommunications band around 1550 nm. The core of SSMF is doped with GeO2. Since this doping slightly increases fibre losses, better attenuation characteristics can be obtained by doping the cladding with fluorine [5].
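The single-mode condition can be checked with the normalised frequency V = (2πa/λ)·NA, which must stay below 2.405 for a step-index fibre. The sketch below uses the 4.5-µm core radius quoted in the text; the index contrast is an assumed illustrative value, not one given by the text:

```python
import math

def v_number(core_radius_um, wavelength_um, n_core, n_clad):
    """Normalised frequency V = (2*pi*a/lambda) * NA. A step-index fibre
    is single-mode when V < 2.405 (first zero of the Bessel function J0)."""
    na = math.sqrt(n_core**2 - n_clad**2)
    return 2 * math.pi * core_radius_um / wavelength_um * na

# SSMF-like check at 1550 nm with the 4.5-um core radius from the text;
# the 0.35% index contrast is an assumption chosen for illustration.
n1 = 1.48
n2 = n1 * (1 - 0.0035)
v = v_number(4.5, 1.55, n1, n2)
print(v, "single-mode" if v < 2.405 else "multi-mode")
```

With these parameters V comes out below the 2.405 cutoff, consistent with the claim that this core radius ensures single-mode operation around 1550 nm; at much shorter wavelengths V rises above the cutoff and higher-order modes appear.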

Table of Contents

CHAPTER 1. THE FIBRE-OPTIC TRANSMISSION CHANNEL
1.1. LINEAR TRANSMISSION EFFECTS
1.1.1 Fibre attenuation
1.1.2 Chromatic dispersion
1.1.3 Polarisation mode dispersion
1.1.4 Polarisation dependent loss
1.2. NONLINEAR TRANSMISSION EFFECTS
1.2.1 Kerr effect
1.2.2 Self-phase modulation
1.2.3 Cross-phase modulation
1.2.4 Four wave mixing
1.2.5 Intra-channel cross-phase modulation and four-wave mixing
1.2.6 Cross-polarisation modulation
1.2.7 Nonlinear phase noise
1.2.8 Non-elastic scattering effects
1.3. SUMMARY
CHAPTER 2. FIBRE-OPTIC COMMUNICATION SYSTEMS
2.1. TRANSMITTERS AND DIRECT-DETECTION RECEIVERS
2.1.1 General transmitter and receiver aspects
2.1.2 On-off keying
2.1.3 Phase modulation formats
2.1.4 Return-to-zero pulse-shaping
2.2. POLARISATION DIVISION MULTIPLEXING
2.2.1 Transmitter scheme
2.2.2 Noise sensitivity
2.3. WAVELENGTH DIVISION MULTIPLEXING
2.4. LOSS COMPENSATION
2.4.1 Erbium doped fibre amplifiers
2.4.2 Distributed Raman amplification
2.4.3 Optical signal to noise ratio evolution
2.4.4 Trade-off between noise and nonlinearities
2.5. CHROMATIC DISPERSION COMPENSATION
2.6. A LABORATORY TOOL: THE RECIRCULATING LOOP
2.7. SUMMARY
CHAPTER 3. COHERENT DETECTION: THE COME-BACK
3.1. RECEIVER SENSITIVITY
3.2. POLARISATION DIVERSITY COHERENT MIXER
3.3. DIGITAL SIGNAL PROCESSING ASSOCIATED WITH COHERENT DETECTION
3.4. SUMMARY
CONCLUSION
