The first article in this series focused on growth in bandwidth demand and on attenuation in optical fibers. Article 2 covered the several types of dispersion that exist in fiber, and Article 3 dealt with fiber strength and reliability.
This article, the fourth in the series, will focus on single-mode fiber geometries.
When speaking about fiber geometry, we typically consider the diameters of the core, the mode field (MFD), the cladding and the coating, along with their concentricities and ovalities, and finally the actual curl of the fiber. More on these later.
The primary impact of fiber geometry occurs in the splicing and connectorization processes. Fibers with good and consistent geometry tend to splice and connect better, with lower losses, than others. In addition, as highlighted in Article 2, core concentricity can also play an important role in polarization mode dispersion (PMD) performance and is therefore an important parameter. For high-quality fibers, geometry has been good for a long time, and we may have become so accustomed to it that we sometimes take it for granted. However, this is not automatically true of every fiber.
We’ll work our way through a typical fiber specification, highlighting the importance of various single-mode fiber geometry specifications. In the first figure we present the basic properties of the geometry of a fiber.
CLADDING DIAMETER (OUTER GLASS DIAMETER)
Cladding diameter is the outer diameter of the glass portion of the fiber. For telecommunications fibers, this diameter has been 125 microns (μm) for a very long time. The diameter tolerance, on the other hand, has not always been ±0.7 μm.
During the 1980s, optical fibers had outer diameter tolerances as high as +/- 3.0 μm. As illustrated in Figure 2, trying to match up 8 micron fiber cores when cladding diameters varied between 122 and 128 μm in diameter could result in very high losses, since the two cores may be significantly misaligned even though the claddings of the two fibers are perfectly aligned. This situation is why fusion splicing machines required additional technology to help align the actual fiber cores. This extra technology, however, increased the price of the splicing units.
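The scale of the problem can be sketched with the classic Gaussian mode-overlap approximation, in which the splice loss from a lateral core offset d is roughly 4.343·(d/w)² dB, with w the mode field radius. The following is an illustrative calculation only, assuming a 9.2 μm MFD:

```python
def lateral_offset_loss_db(offset_um, mfd_um=9.2):
    """Approximate splice loss (dB) from a lateral core offset between two
    identical single-mode fibers, using the Gaussian mode-overlap model."""
    w = mfd_um / 2.0              # mode field radius
    return 4.343 * (offset_um / w) ** 2

# In a fixed V-groove, a 6 um cladding diameter mismatch (122 vs 128 um)
# can leave the two cores offset by roughly 3 um even with both claddings
# seated perfectly in the groove.
print(round(lateral_offset_loss_db(3.0), 2))    # roughly 1.85 dB
print(round(lateral_offset_loss_db(0.35), 3))   # a few hundredths of a dB
```

Note how an offset plausible under 1980s tolerances costs nearly 2 dB, while an offset within a modern ±0.7 μm cladding spec stays in the range of typical modern splice losses.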
As the industry matured, single-mode fiber diameters remained the same at 125 μm. However, over the same time period, the specification tolerance declined to 0.7 μm with typical variation along the length of the fiber becoming even tighter.
From a manufacturing perspective, such tolerances were not easy to achieve. When fiber was first invented, the developers had to create manufacturing methods along with ways to measure fiber diameter. When manufacturing to tolerances of tenths of a micron, inputs such as stray air currents, vibrations or particulate in the glass can cause significant diameter variability. These factors require top-tier fiber manufacturers to have very tight control over their processes and procedures.
As diameter variability has decreased, splicing machines have been able to reduce the alignment technology needed. While there has been a significant decrease in the price of these machines, there has been no corresponding substantial increase in splice loss. Core alignment splicing machines still provide the best performance; however, smaller "fixed V-groove" machines with lower prices and limited alignment capability have significantly reduced the performance gap. The typical splice loss for OFS AllWave®+ Zero Water Peak (ZWP) Optical Fiber, spliced using a core alignment splicing machine, is roughly 0.03 dB, whereas the same fibers spliced with a fixed V-groove machine have an average loss of approximately 0.05 dB. In a comparison of the absolute values, that is a significant difference; in the context of most fiber optic network applications, however, the difference is rather insignificant.
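To put these per-splice numbers in context, here is a quick, illustrative loss budget; the route length, splice spacing and the 0.18 dB/km attenuation figure are assumptions, not measured values:

```python
# Illustrative budget for a 100 km route with a splice every 4 km;
# the per-splice figures are the typical values quoted above.
splices = 25
core_align_total = 0.03 * splices       # core-alignment machine
v_groove_total   = 0.05 * splices       # fixed V-groove machine
fiber_total      = 0.18 * 100           # assumed 0.18 dB/km at 1550 nm

print(round(core_align_total, 2), round(v_groove_total, 2))  # 0.75 vs 1.25 dB
print(round(v_groove_total - core_align_total, 2))           # 0.5 dB difference
print(round(fiber_total, 2))                                 # against 18 dB of fiber loss
```

Half a dB of extra splice loss on a link whose fiber alone contributes 18 dB illustrates why the cheaper machines are acceptable in most networks.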
Enabled by tighter fiber geometry, the reduced cost of splicing machines is one of the factors that have contributed to the overall decrease in the cost of building fiber networks. In fact, this change has ultimately enabled fiber to the home to become a reality.
MODE FIELD DIAMETER (MFD)
Mode field diameter (MFD) is another specification related to fiber geometry. In a typical G.652.D-compliant single-mode fiber, not all of the light travels in the core; in fact, a small amount of light travels in the fiber cladding. The MFD is a measure of the width of the optical power density distribution: it is the diameter at which the optical intensity has fallen to 1/e² of its peak value, and for a near-Gaussian mode it encloses roughly 86% of the optical power.
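As a sketch of what the MFD means in power terms, for an ideal Gaussian mode (an approximation of the real fundamental mode) the fraction of power enclosed within a given diameter can be computed directly; the 9.2 μm nominal MFD below is just a typical value:

```python
import math

def power_within_diameter(d_um, mfd_um=9.2):
    """Fraction of total power inside diameter d for an ideal Gaussian mode
    whose 1/e^2 intensity diameter equals the MFD."""
    w = mfd_um / 2.0                  # mode field radius
    r = d_um / 2.0
    return 1.0 - math.exp(-2.0 * (r / w) ** 2)

print(round(power_within_diameter(9.2), 3))    # ~0.865 of the power inside the MFD itself
print(round(power_within_diameter(15.0), 3))   # the tails reach well into the cladding
```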
MFD is important for two main reasons.
The first reason is that fiber bending loss is typically correlated with MFD. Unless special fiber designs are used, bend loss increases as the MFD increases, and vice versa; historically, fibers with smaller mode field diameters have been less bend sensitive. That said, modern fiber designs have enabled fiber manufacturers to make bend-insensitive, single-mode fibers with a nominal mode field diameter of 9.2 μm – the same as the vast majority of classical standard G.652.D fibers. There are also fibers with MFDs as low as 8.6 μm offering superior bending performance, with very low bending losses even for bending diameters as low as 5 mm – half of the smallest bending diameter specified by the ITU-T for G.657.B3 fibers.
Second, because of their ease of use, OTDR measuring instruments are often used to measure attenuation. However, OTDRs only give correct results if the measurement conditions are ideal, and a sudden jump in MFD is definitely not an ideal measurement condition. So, when two fibers of different mode field diameters are spliced together, the OTDR will erroneously show either a power gain, known as a "gainer", or an elevated loss, depending on the direction in which the measurement is taken. When measured from the larger MFD into the smaller, a gainer is produced. When measured from the smaller MFD into the larger, an elevated loss is seen, as shown below. This is an artifact of the OTDR measurement method and does not affect transmission properties. Breaking and re-splicing the fibers will typically not change the result, unless there is a bad cleave or some other anomaly at the splice interface. The correct way to measure splices is bi-directional OTDR measurement, which is even more important for fibers with MFD mismatches.
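A simplified model illustrates both the true loss of an MFD-mismatched splice and the directional OTDR artifact. It assumes Gaussian modes and that the captured backscatter scales roughly as 1/MFD²; real OTDR traces are more complicated, but the bidirectional-averaging property is the point:

```python
import math

def mfd_mismatch_true_loss_db(mfd1, mfd2):
    """True splice loss between two fibers of different MFD
    (Gaussian mode-overlap approximation)."""
    return -20 * math.log10(2 * mfd1 * mfd2 / (mfd1**2 + mfd2**2))

def otdr_apparent_loss_db(mfd_from, mfd_to):
    """Apparent one-way OTDR reading: the true loss plus a backscatter-level
    step, assuming the captured backscatter scales as 1/MFD^2."""
    true = mfd_mismatch_true_loss_db(mfd_from, mfd_to)
    return true + 10 * math.log10(mfd_to / mfd_from)

a_to_b = otdr_apparent_loss_db(9.2, 8.6)   # larger MFD into smaller: a "gainer"
b_to_a = otdr_apparent_loss_db(8.6, 9.2)   # smaller into larger: exaggerated loss
print(round(a_to_b, 3), round(b_to_a, 3))
print(round((a_to_b + b_to_a) / 2, 3))     # bidirectional average = true loss
```

The true loss of the 9.2/8.6 μm splice is only about 0.02 dB, yet the one-way readings swing roughly ±0.3 dB around it, which is exactly why bidirectional averaging matters.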
This artifact is one reason it may be advantageous to use bend-insensitive fiber with an MFD of 9.2 μm. Since experience with the installation and measurement of fibers is almost always gained working with standard G.652.D fibers with 9.2 μm MFDs, such bend-insensitive G.657 fibers will behave in a very familiar way in terms of splicing and control measurements, especially when splicing to the installed base of 9.2 μm MFD single-mode fiber.
CORE-CLAD CONCENTRICITY ERROR
Core/clad concentricity error (CCCE or CCE), also called Core-Clad Eccentricity, measures how well the core is centered in the fiber. CCCE is measured in microns and, of course, the closer the core is to the perfect center, the lower the CCCE value and the better the fiber. Here again, core-aligning splicers tend to give the lowest splice losses.
Clad non-circularity, also called Cladding Ovality, measures a fiber's deviation from being perfectly circular – an oval rather than a circle. It is measured as the percentage difference between the "long" and the "short" diameters of the oval. If the cladding non-circularity is zero, the cladding forms a perfect circle. As with other fiber properties, better cladding non-circularity can result in improved splicing and connectorization performance.
While coating specifications are not as stringent as glass specifications, they are also important – especially when fibers are used in ribbons. The two main parameters are the coating diameter (uncolored), typically specified as 237 – 247 μm, and the coating-clad concentricity error, typically specified as at most 0.5 μm.
For roughly the first 30 years of single-mode fiber manufacturing, a coating nominal diameter of approximately 245-250 μm was standard in the industry. However, in 2014, OFS launched a 200 μm fiber in response to the need for higher fiber density in fiber optic cable designs.
Although the difference between 200 and 250 μm is not tremendously large, smaller diameter fibers can enable twice the fiber count in the same size buffer tube in a cable, while also still preserving long-term reliability. This fact has led to many new compact cable designs, including extremely small micro-cables, loose tube duct cables and all-dielectric, self-supporting (ADSS) aerial cables. As the demand for higher fiber density continues to increase, we can expect to see even more cable designs taking advantage of smaller diameter coatings. Important though, is that the coating will still be able to sufficiently shield the fibers from micro-bending, which may otherwise cause increased losses in the fiber when the fiber is inadvertently being “squeezed” in the cable – especially with low temperature.
Another possibility is to make the glass fiber itself less sensitive to such potential problems – so it’s not just a simple task of reducing the thickness of the fiber coating, but obtaining a sufficiently good fiber performance too.
Beyond the nominal size, coating diameter control is extremely important, since the coating diameter affects the overall size of a fiber bundle or ribbon. If the coating is too thick, the bundle may incur strain sooner than expected. If, on the other hand, coating concentricity is poor, there can be additional concerns, particularly when splicing ribbons.
The final parameter we will discuss is fiber curl.
Fiber curl assesses the non-linearity of bare glass. In other words, fiber curl measures how straight the glass fiber is when no external stressors are present. If imbalanced stresses are frozen into a fiber during the draw process, curl can result. This curl can show up during the splicing of fiber optic ribbons or when fixed V-groove splicing machines are used.
If curl is present, the two fiber ends will not be straight or match up during the splicing process. This situation leads to both high losses and difficulty splicing. Curl is specified as a radius of curvature measured in meters, with a typical specification being > 4 m. When optical fiber comes out of the fiber draw, it is annealed during the manufacturing process to reduce the effects of curl. As a result of this process, for users of top-quality fiber, fiber curl poses no concern in typical telecom applications.
Fiber geometry is often taken for granted by end users, primarily because it has been very good for so long. However, it has taken hard work and the contributions of innumerable people over many years for fiber geometry quality to reach its current level.
Article 1 in this series focused on growth in bandwidth demand and on attenuation in optical fibers, both the attenuation caused by bending and the built-in mechanisms of scattering and absorption. Article 2 focused on several types of dispersion that exist in fiber.
This article, the third in the series, will focus on fiber strength and reliability.
Except for products specifically intended for short-term use, lifetime and durability are normally of rather high importance to the user. For optical fibers, the installation of the cabled fibers is typically both time consuming and very costly, so reliability and lifetime may be even more important.
Historically this has resulted in a lot of focus on fiber lifetime, and although materials for optical fiber manufacture have been under constant improvement over the last 30 – 40 years, the longevity of even some of the first fiber optic cables is impressive.
Unlike most other fiber properties, we are here dealing with statistical properties which are difficult and time consuming to measure, and the exact lifetime of an individual fiber cannot be predicted.
As such, this is no different from lifetime considerations for other products, but the consequences of a failure are typically more severe.
Fiber has a reputation for excellent reliability, and that hasn’t been acquired by luck. For most end users, fiber just works, and they don’t give the topic much thought. However, this performance is based on a deep understanding of the mechanical performance of the glass to the molecular level. Researchers have spent entire careers acquiring this understanding and translating this knowledge to product design, manufacturing, and installation recommendations.
WHY DO FIBERS BREAK?
For starters, pristine glass optical fiber is stronger than steel when compared diameter for diameter. Although the actual intrinsic strength of glass varies with sample preparation, it is often greater than 700,000 pounds per square inch (psi), compared to 70,000 – 85,000 psi for cold-rolled steel.
Although glass is very strong, it is brittle, so it cannot be stretched very far without breaking. Imperfections and mechanical flaws may weaken the glass significantly, as may impurities even as small as fractions of a micron.
To limit the number of impurities to an absolute minimum, OFS uses synthetic silica to make its glass. Synthetic silica (SiO2) is made from ultra-pure chemicals which are pure to the parts-per-billion level, and the actual core of the fiber is pure to the 10 parts-per-trillion level. This is akin to allowing only one bad grain of rice in a standard 20 ft shipping container filled with good rice. By maintaining such extremely high glass purity, the risk of fiber weakening because of impurities is minimized. Other manufacturing methods may not provide a similarly high purity in the finished glass fiber.
But impurities are not the only cause of fiber breaks. A glass surface typically contains a number of cracks of different sizes. This is true for fiber glass surfaces as well, and when stress is applied to the fiber, the cracks will grow larger. As might be expected, a large stress on the fiber will cause cracks and other flaws to grow more rapidly than a small stress will.
There may be a huge difference in the speed by which the cracks grow, and this may not always seem obvious. More on that later.
Another issue which may not seem obvious is that stress is also applied to a fiber simply by bending it. As shown in Figure 3, the outermost surface of a fiber bend is stretched, resulting in tension, whereas the innermost surface is compressed. Compression does not cause cracks in the fiber surface to grow larger – but tension does.
It probably seems more obvious that tension applied along the length of the fiber (typically by pulling the fiber) will also cause cracks and flaws to grow.
When cracks or flaws grow larger the strength of the fiber is reduced because a smaller cross section of the fiber is now holding the fiber together. So, a fiber with large cracks or flaws will break at a lower strain than a pristine fiber containing only small (or no) cracks.
To ensure that no large cracks or flaws are present in the fiber, a manufacturing step called “proof testing” is added after the actual making of the fiber. During proof testing, every 1 – 2 meter section of the fiber is subjected to a relatively large strain. So large that it makes the fiber 1% longer. Under such large strain cracks and flaws in the fiber will grow quickly, and if the cracks are large enough the fiber will simply break. This ensures that after proof testing no large cracks or flaws exist in the fiber anymore. However, small cracks and flaws will still be present and that is why the glass production method is so important. The purer the glass, the less likely problems will occur long term.
The proof test itself is simple: the fiber is typically guided around a pulley with roughly a 1 kg weight attached to produce a 100 kpsi proof test (Figure 6).
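The 1 kg weight, the 100 kpsi stress and the 1% elongation are mutually consistent, as a few lines of arithmetic show (the Young's modulus of fused silica is an assumed value of about 72 GPa):

```python
import math

PSI_TO_PA   = 6894.76       # 1 psi in pascal
E_SILICA_PA = 72e9          # assumed Young's modulus of fused silica (~72 GPa)

radius_m = 125e-6 / 2.0     # 125 um cladding diameter
area_m2  = math.pi * radius_m ** 2

stress_pa = 100e3 * PSI_TO_PA         # 100 kpsi proof stress
tension_n = stress_pa * area_m2       # tension required in the fiber

print(round(tension_n / 9.81, 2))               # weight on the pulley: ~0.86 kg, i.e. roughly 1 kg
print(round(stress_pa / E_SILICA_PA * 100, 1))  # strain: ~1.0 % elongation
```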
Fibers for use in submarine cables are normally proof tested even more stringently (2% elongation) to ensure even better lifetime performance.
An undamaged fiber which is proof tested 1% (100 kpsi) and used in standard applications and cable constructions will have a very long lifetime – typically 40 years or more. But fibers may get damaged. If that happens before proof testing, the fiber will break and the problem is eliminated.
However, damage to the fiber after proof testing may happen on rare occasions. Fibers are wound on spools for easy and safe handling and transportation, and if the wound fiber "package" is somehow subjected to a mechanical impact – for example, if the fiber spool is accidentally dropped and hits the corner of a table – this may create large cracks in the fiber surface, and the fiber may suddenly no longer possess the expected strength.
Likewise, the actual glass surface of the fiber may be “scratched” – potentially leaving large cracks in the fiber surface. Such scratches may have different causes but probably the most obvious of those causes is that the tool used to strip the coating of the glass fiber may inadvertently inflict scratches on the fiber surface. Also, sharp small objects may penetrate the fiber coating if the fiber is being forced against them – again possibly scratching the fiber glass surface.
In “loose tube” cable constructions fibers are placed in plastic tubes, which are helically wound around the cable-core. Such a helical construction ensures that cables may be pulled – and thereby elongated – quite a bit during installation before strain is transmitted to the actual fibers. The effect is the same as seen in a classical “spiral” telephone handset cord which can be extended quite a long way without putting additional strain on the copper wires within the cord.
Most cable constructions are similarly able to “absorb” significant strain on the cable before transmitting any additional strain to the fibers themselves, and in general such a “maximum pulling force” is always specified for cables – although different names may be used for it.
Nevertheless, a small risk does exist that, for example, heavy machinery stretches sections of a cable so much that the fibers are unintentionally subjected to excess tension, leading to fast growth of cracks in the fiber surface. As a result, the fiber lifetime may be reduced, possibly without leaving any trace on the cable or the fibers themselves. With modern equipment, training and knowledge, however, fiber damage problems are highly uncommon.
In some situations, thermal variances may result in the movement of cables relative to the closures, and that may put the fibers under additional tension from time to time.
Because of the helical structures present in many cable constructions, the fiber is effectively situated in large-diameter bends along the full cable length. Furthermore, it is common practice to coil a small amount of excess fiber in 50 mm or 60 mm coils in splice cassettes. So, in reality, the fiber is almost always under constant – but small – stress during its operational lifetime. Since the bending diameters are typically large, the resulting lifetime for an undamaged fiber is still very long, and the breaking risk of a high-quality fiber under such conditions is therefore negligible – a very important feature for cables carrying signals to thousands of users.
For a damaged fiber, however, a rather puzzling situation may arise. The fiber may be fully installed, control measured, and all parameters may appear to be fine. However, because of damage the fiber may contain rather large cracks, which may propagate and hence weaken the fiber. Even with commonly used cable constructions and splice cassettes, damaged fibers may break rather quickly. It’s possible that fiber breaks can start appearing months after installation when everything appeared to be in order at the time of installation and even when nobody has actually touched the fibers since.
Some fiber cables are specifically intended for applications where small bending diameters are required. This could be drop cables to cell towers or 5G small cells – and in particular cables intended for installation in homes and apartments where it is important to be able to route the cable closely around door frames, windows and ceilings in order to make the cable as invisible as possible. Cables as thin as 0.6 mm are available, which may be bent with a radius as tight as 2.5 mm, while maintaining a failure risk which is still much smaller than that of typical electronic consumer equipment.
In Figure 8, the results of a typical lifetime test are shown. The test registers the lifetime of a high-quality fiber when that fiber is bent at different diameters. What is important to notice is that lifetime is hugely dependent on the bending radius: for the fiber in the test, the lifetime changes from 1 minute to 40 years as the bending radius is changed from 1.0 mm to 1.8 mm. Although very long lifetime predictions tend to be somewhat questionable, it is obvious that a bending radius of 2.5 mm – like the one recommended for some ultra-bend-insensitive fibers – will still give a very long lifetime.
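As an illustration only, a crude power-law fit through the two data points quoted above shows how steep the radius dependence is; the real lifetime models behind the standards are considerably more elaborate, so the extrapolation should not be read as a prediction:

```python
import math

# Two points read off a typical bend-lifetime test:
# ~1 minute lifetime at r = 1.0 mm, ~40 years at r = 1.8 mm.
minutes_in_40_years = 40 * 365.25 * 24 * 60

# If lifetime scales as a power of the bend radius, t ~ r**n, then:
n = math.log(minutes_in_40_years / 1.0) / math.log(1.8 / 1.0)
print(round(n, 1))        # a fatigue-like exponent of roughly 29

# Extrapolating the same power law to a 2.5 mm radius (illustrative only):
lifetime_years = (2.5 / 1.8) ** n * 40
print(f"{lifetime_years:.0e}")   # many orders of magnitude beyond 40 years
```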
However, rather than focusing on lifetime, it may be much more interesting for the user to have a prediction of the number of failures expected over a reasonable period of time. For example, computer hard disk drives are often characterized by the Annualized Failure Rate (AFR), representing the expected average number of failures over one year, and some investigations have found an AFR of approximately 1% for such drives.
For comparison, and using internationally recognized lifetime models, a fiber cable used inside an apartment with, for example, 12 quarter-circle bends of 2.5 mm radius will have an expected failure rate of 45 ppm over 30 years. That is a 0.0045% failure risk over 30 years (not in just one year).
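The arithmetic behind the comparison, with the hard-drive AFR treated as an assumed round figure and the annualization naively assuming a constant rate:

```python
failures_ppm = 45          # expected failures per million installations
years = 30

lifetime_prob = failures_ppm / 1e6
print(f"{lifetime_prob * 100:.4f} %")    # 0.0045 % over 30 years

# Naively annualized and compared with a ~1 % annualized failure
# rate reported for hard disk drives:
annual_rate = lifetime_prob / years
hdd_afr = 0.01
print(round(hdd_afr / annual_rate))      # roughly 6700x lower failure rate
```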
So, fiber optic cables in typical indoor home installations, even with bend radii as low as 2.5 mm, still have a far lower failure risk than the typical electronic household equipment we know and use in our daily lives. And since a – very rare – fiber break will affect only a few users, the possibility of having almost invisible cable installations using thin cables and 2.5 mm radius bends outweighs the small risk of failure for many users.
As a quick refresher, Article 1 in this series focused on growth in bandwidth demand. We also looked at attenuation in optical fibers caused by factors external to the fiber, such as bending, and at the built-in attenuation mechanisms, scattering and absorption.
In this second article, we will focus on the several types of dispersion that exist in fiber.
DISPERSION – WHAT IS IT?
Much, but not all, of the traffic traveling through fiber networks takes the form of pulses of laser light. Such a pulse is created by turning a laser on and off, creating light pulses where “no light” represents a digital “0” – and “full light” represents a digital “1”. Digital information is consequently a series of “no light” and “full light” transmitted in a code which a receiver at the other end of the fiber understands and can convert to a digital electrical signal.
Illustrating such a signal would be a series of square pulses as shown in Figure 1.
Whenever such a signal is affected by dispersion, the edges of the square pulses will be rounded, and the pulse will be spread out over time. So dispersion broadens the pulses.
If the dispersion is small, the detector at the other end of the fiber will still be able to detect the signal correctly. Once the dispersion grows too large, the broadened pulses will overlap each other and the detector will start misreading the signal, creating errors that will effectively hamper the transmission quality. A measure of that quality is the BER (Bit Error Rate) which states the number of transmission errors relative to the total number of transmitted bits.
Since a faster transmission rate requires pulses to be of a shorter duration, this also means that a given level of dispersion will be more harmful to faster transmission rate signals. Furthermore, dispersion is almost always dependent on the fiber length – the longer the fiber, the greater the dispersion.
Hence transmission is limited by: A) the dispersion of the fiber, B) the transmission rate, and C) the length of the fiber. Dispersion can be described as a "speed limiter", and its three main types are Modal Dispersion, Chromatic Dispersion and Polarization Mode Dispersion.
Modal Dispersion is the most serious of the dispersion types, and hence the most severe “speed limiter”.
Light "modes" are different types of waves carrying the light through the fiber. In a "Multi Mode" fiber, the core is rather large and typically allows many different modes to propagate. In a "Single Mode" fiber, the core is so small that it will allow only one mode to propagate.
The problem is that the different modes follow different paths through the fiber – and these paths are of different lengths. Some modes travel close to the center of the core – others bounce against the outer edges of the core, and these modes travel a longer way than the ones close to the center. So the different modes travel different distances – and hence some tend to travel faster than others. Parts of the light being injected into the fiber will travel via one mode – other parts via another mode – and so on. If nothing is done to mitigate this, parts of the input signal will arrive at the output later than other parts – and this will cause the output signal to be “dispersed” relative to the input signal as illustrated in Figure 1.
To minimize the dispersion of the signal at the output of the fiber, the core of a multimode fiber is designed to delay the light modes travelling close to the center of the core (the shortest path) and to speed up the modes travelling the longest paths. In a perfect world this would result in all modes delivering light simultaneously to the output of the fiber. Alas, the world is less than perfect, and so a bit of Modal Dispersion cannot be avoided in real life.
This means that, even though Multimode fibers are able to use very price-efficient light sources (like LEDs or VCSELs), they are still limited to transmission distances of typically less than 2 km, and often less than a few hundred meters.
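The scale of Modal Dispersion can be estimated for a simple step-index fiber with a ray argument: the most oblique guided mode lags the axial mode by a factor set by the relative index difference. The index values below are illustrative assumptions:

```python
C = 299_792_458        # speed of light in vacuum, m/s

n1    = 1.48           # assumed core refractive index
delta = 0.01           # assumed relative index difference (n1 - n2) / n1

# Worst-case modal delay spread of a step-index multimode fiber:
# the most oblique guided mode lags the axial mode by (n1 / C) * delta per meter.
spread_ns_per_km = n1 / C * delta * 1000 * 1e9
print(round(spread_ns_per_km, 1))   # ~49.4 ns of spread per km

# A graded-index core profile, as described above, reduces this spread
# by two to three orders of magnitude, but never to exactly zero.
```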
The way to avoid Modal Dispersion is to shrink the size of the fiber core. In a small fiber core there is only room for one light mode to exist, called the Fundamental Mode. In such single-mode fibers, higher order modes may indeed be generated at splices or connectors, but they will leak out of the fiber after traveling a short distance through the fiber.
Having now found a way to avoid the most important speed limiter we can turn our attention to the next in line.
Chromatic Dispersion means that light of different wavelengths travel with different speeds along the fiber. Again, such a difference results in the “blurring” of the signal on the output side of the fiber and effectively acts as a speed limiter.
One might wonder why this should be such a problem, since the lasers used to inject light into the fiber have very precisely defined and stable wavelengths. However, quickly turning a laser on and off by itself generates a number of new wavelengths close to the original laser wavelength. Most of these new wavelengths are luckily quite weak and will not cause problems, but unfortunately, as the laser is turned on and off ever more quickly, the range of generated wavelengths broadens (Figure 5).
In such transmission systems the problems caused by Chromatic Dispersion worsen with increasing transmission speed and with longer fiber lengths (scaling linearly with fiber length).
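The resulting pulse broadening is simply the dispersion coefficient multiplied by the fiber length and the spectral width. An illustrative example, with all three values assumed rather than taken from the text:

```python
# Illustrative assumptions: a standard G.652-type fiber at 1550 nm
# carrying a directly modulated signal.
D_ps_per_nm_km = 17.0     # chromatic dispersion coefficient
length_km      = 80.0
width_nm       = 0.1      # spectral width of the modulated signal

broadening_ps = D_ps_per_nm_km * length_km * width_nm
print(broadening_ps)      # 136 ps of pulse spreading

# For comparison, one bit slot at 10 Gbps is 100 ps, so after 80 km the
# pulses would smear across neighboring bit slots without compensation.
bit_slot_ps = 1e12 / 10e9
print(bit_slot_ps)        # 100.0
```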
To minimize problems with Chromatic Dispersion, the "Dispersion Shifted" (ITU-T G.653) fiber type was initially developed. In classical standard single-mode (ITU-T G.652) fibers, the Chromatic Dispersion is zero around 1310 nm. Dispersion Shifted fibers were instead designed to have zero Chromatic Dispersion around 1550 nm, because the attenuation of the fiber is lower at 1550 nm, and so this combination seemed ideal.
Basically, this worked fine right up until DWDM arrived. In DWDM systems a number of individual channels are transmitted over the same fiber. Each channel is assigned a unique wavelength, but unfortunately the fiber non-linearity called Four Wave Mixing (FWM) tends to cause unwanted noise problems in DWDM systems if the Chromatic Dispersion in the fiber is very low.
So, realizing that some level of Chromatic Dispersion is preferable in order to limit fiber non-linearity problems in DWDM systems, the Non-Zero Dispersion Shifted fiber (ITU-T G.655) was developed. This fiber type has a small amount of Chromatic Dispersion around 1550 nm (significantly smaller than standard G.652 fibers), so the "speed limitation" is smaller – but the Chromatic Dispersion is still high enough to reduce non-linearity problems very significantly. Later, the G.656 Non-Zero Dispersion Shifted fibers were developed in response to the demand for an increasing number of channels in DWDM systems. When the number of channels goes up, the individual channels need to be packed more closely together, and that in turn requires more Chromatic Dispersion in the fiber to reduce the effect of Four Wave Mixing.
In parallel with the development of new fiber types with different Chromatic Dispersion characteristics, special devices with negative Chromatic Dispersion were developed. Since transmission fibers normally have positive Chromatic Dispersion, a combination of those two can be used to reduce total Chromatic Dispersion for a full fiber link to almost zero.
With the ability to reduce the total chromatic dispersion of a transmission link, the higher Chromatic Dispersion of the G.656 fibers was consequently an acceptable technical compromise – leaving only cost issues still to be considered.
In many of the recent high-capacity transmission systems, the Chromatic Dispersion of the transmission fiber is compensated electronically with high efficiency, and for such systems fibers with high Chromatic Dispersion may actually be advantageous because it helps to limit fiber non-linearities.
Just to make the confusion complete, a single-mode fiber will actually be able to carry TWO versions of the fundamental light mode. The reason is that light may exist in two different polarizations, whose modes are perpendicular to one another. The phenomenon is familiar from polarized sunglasses, which cut away one of these polarization modes. Sunlight reflected from the sea surface or a wet road will predominantly consist of light in one of these polarization modes, whereas light reflected by other objects will consist of a mixture of the two. Cutting away the polarization mode of the reflected light will "kill" the reflections but let the other polarization mode pass through the glasses, leaving other objects visible.
In an optical fiber, the two polarization modes will both exist, but may travel at different speeds through the fiber. Such speed-differences will arise if the fiber core is not perfectly circular and if stress is present in the fiber. Stress can be “frozen” into the fiber during manufacturing if the fiber geometry is not absolutely perfect, for example, if the cladding or coating is not circular, or if the center of the core is different from the center of the cladding or coating.
Even using state-of-the-art, high-quality manufacturing processes, the fiber will not be geometrically 100% perfect. Hence there will be a speed difference between the two polarization modes, dispersion will result, and it may limit high-speed transmission through the fiber. Even if the fiber were 100% perfect, the slight bending of the fiber in a cable would introduce stress in the fiber, creating Polarization Mode Dispersion (PMD). So this is our third speed limiter.
Looking at a fiber from a “PMD-perspective” it may be thought of as having a “fast” and a “slow” lane. An effective way of reducing PMD is by twisting the fiber back and forth during manufacturing so that a high number of shifts between the “fast” and “slow” lanes are effectively seen by the light travelling through the fiber.
Because stress is an important cause of PMD, externally applied stress will also affect fiber PMD. In reality just holding a fiber between two fingers may change PMD. As a result, the PMD of a fiber may be affected both by the cabling of the fiber and by external stresses, for example vibrations from a nearby railroad.
As with other dispersion types, the effect of PMD increases with transmission distance (PMD scales with the square root of the distance) and with transmission speed. At transmission rates of 2.5 Gbps and below, PMD is normally not a problem. For very high transmission rate systems, PMD compensation is today performed electronically and built into the transmission equipment.
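The square-root distance scaling is worth making concrete. Below is a minimal sketch; the 0.1 ps/√km coefficient and the link lengths are illustrative assumptions, not values from this article:

```python
import math

def pmd_delay_ps(pmd_coeff_ps_sqrt_km: float, length_km: float) -> float:
    """Mean differential group delay accumulated over a link.

    PMD grows with the square root of distance, unlike chromatic
    dispersion, which grows linearly with length.
    """
    return pmd_coeff_ps_sqrt_km * math.sqrt(length_km)

# A fiber specified at 0.1 ps/sqrt(km):
print(pmd_delay_ps(0.1, 100))  # ~1.0 ps over 100 km
print(pmd_delay_ps(0.1, 400))  # ~2.0 ps over 400 km: 4x the length, only 2x the PMD
```

Note how quadrupling the length only doubles the accumulated PMD, which is why PMD matters most on long, high-speed routes.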
The fiber optic cable world has come a long way over the past 30 years. Products have become more rugged and user friendly, making it easier for people to enter the industry and work handling optical fiber and cable. While this is great for the industry, many people may understand the “how to” but not necessarily the “why” of fiber optics. To understand the “why” behind fiber and cable products, the next step is to become a full-fledged “fiber geek.” Because the industry changes so quickly, this is an ongoing process. The purpose behind this series of articles is to enable the reader to understand some secondary fiber specifications and their importance to the network.
Once fiber is deployed, it’s very expensive to replace. For this reason, the fiber that’s installed should be capable of withstanding multiple generations of hardware while also having plenty of room for additional wavelength growth.
The graphic on the right highlights how wavelength usage has grown over the past three decades. For the first 30 years, applications were focused in the 1310 nm and 1550 nm regions. Given the explosive demand for bandwidth (even more so since COVID-19), it’s reasonable to assume that the next 30 years will require many more wavelengths, with potential applications across the entire optical spectrum.
The demand for bandwidth is expected to continue far into the future, driven in part by breakthrough applications such as higher-resolution video, virtual reality and others. We expect this demand to continue to drive the need for the optical spectrum provided by fiber. Fiber recommendations such as ITU-T G.652 and ITU-T G.657 are very important for network designers in setting minimum performance levels, but can ultimately be insufficient to meet the requirements of future networks. For this reason, performance beyond the standards can be very important.
This article will focus on critical optical parameters starting with attenuation, or loss in the fiber. Attenuation is a very important optical parameter, and there are many aspects to it. Additional articles in this series will focus on other optical parameters, including chromatic and polarization mode dispersion, splice loss, and an introduction to non-linear effects.
Keeping fiber attenuation low has always been a focal point in fiber development – today even more so with the widespread use of coherent transmission systems. These require large-core, ultra-low-attenuation fibers (typically ITU-T G.654 fiber types) for optimal performance in 100G and faster transmission systems.
Attenuation is typically measured in optical dB, a logarithmic measure where the loss of a fiber equals 10*log (“power at the input side of the fiber” / “power at the output side of the fiber”). Every 3 dB of loss corresponds to the optical power being cut in half. It is fair to assume that the attenuation of a fiber is almost constant over its length, so if a fiber’s loss is 0.25 dB/km, a total loss of 3 dB is reached after the optical signal has passed through 12 km of fiber.
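That arithmetic can be written out directly as a quick sanity check (a minimal sketch; the function names are ours, and the example values mirror the 0.25 dB/km figure above):

```python
import math

def loss_db(p_in_mw: float, p_out_mw: float) -> float:
    """Fiber loss in dB: 10 * log10(input power / output power)."""
    return 10 * math.log10(p_in_mw / p_out_mw)

def length_for_loss_km(total_loss_db: float, atten_db_per_km: float) -> float:
    """Distance at which a uniformly attenuating fiber reaches a given total loss."""
    return total_loss_db / atten_db_per_km

# Halving the power costs ~3 dB:
print(round(loss_db(1.0, 0.5), 2))    # 3.01
# At 0.25 dB/km, 3 dB is reached after 12 km, matching the text:
print(length_for_loss_km(3.0, 0.25))  # 12.0
```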
Looking at the different loss mechanisms in fibers, it may be helpful to distinguish between:
A) Attenuation caused by factors external to the fiber (for example, bending), and
B) Built-in attenuation mechanisms.
Looking at B) first, there are two main loss mechanisms in optical fibers: scattering and absorption.
The first, also called “Rayleigh scattering”, stems from the fact that even the best and purest synthetic quartz glass (of which OFS fibers are made) is not 100% homogeneous. It consequently contains small fluctuations of glass density, frozen into the glass during manufacturing, which may scatter light that strikes them (the same mechanism is responsible for the blue color of the sky, where sunlight scatters off molecules in the atmosphere). Most of the light continues traveling in the original direction, but a small part is scattered in all directions. Some light propagates sideways out of the fiber, where – for transmission purposes – it is lost. Some is actually scattered backwards towards the sender; this is the phenomenon OTDR measurement devices use to measure fiber attenuation, which is why the device only needs to be connected to one end of the fiber.
In optical fibers, scattering is dominant at shorter wavelengths, whereas the opposite is true for the other built-in attenuation mechanism: absorption (Figure 4).
Basically, absorption happens when a light ray hits something and is converted into heat, so for practical purposes the light simply “disappears”.
Even extremely small impurities – down to a fraction of a micron – may absorb light, causing unwanted attenuation. These may be small particles, but also impurities in the raw materials used for fiber manufacturing. This is why such extremely close attention is paid to the quality and purity of the raw materials used.
Due to the inherent material structure of glass, absorption increases rather drastically at wavelengths longer than approximately 1550 nm (Figure 4).
Of particular interest over the years has been the hydroxyl (OH-) ion, which absorbs light around 1383 nm, giving rise to the so-called “water peak” in the attenuation curve of the fiber (Figure 5, black curve). Being a by-product of the manufacturing process, hydroxyl ions are difficult to avoid entirely, but it is possible to pacify the attenuation increase at wavelengths close to 1383 nm. This is done by adding deuterium gas, which interacts with the free bond of the hydroxyl ion, thereby acting as a barrier and securing excellent long-term water-peak attenuation performance.
Conventional single-mode fibers meeting the G.652 recommendation may have a high Water Peak loss. This could limit the use of the fiber in some applications and may also make the fiber less useful in transmission systems using modern Raman amplification, where amplifier laser-pumps would typically operate 110 nm below the transmission signal wavelength.
OFS has fibers classified as Zero Water Peak (ZWP), with even better specified water-peak performance than the so-called Low Water Peak (LWP) fibers. The long-term stability of ZWP fibers is excellent, whereas for some types of ITU-T G.652 fibers the water-peak attenuation might actually increase over their lifetime, slowly degrading the quality of the network.
Because of the optimized Water Peak performance, ZWP fibers serve the widest ranges of wavelengths and support the highest number of applications, as illustrated in Figure 1.
Figure 5 shows three different grades of ITU-T G.652 fiber and how they may perform in the water-peak region around 1383 nm.
For the most part, scattering and absorption properties are locked into the fiber during manufacturing.
Bending, however, is another story…
Bending is a very important mechanism. As briefly mentioned, it is caused by factors external to the fiber and so both the cabling process and installation in the field can affect attenuation caused by bending.
To put it simply, what makes an optical fiber work is the use of different types of glass for the fiber core and for the glass surrounding the core (known as the cladding). In this way, a sort of tubular mirror surrounding the core is created. This is what keeps the light inside the fiber, using the concept of “total internal reflection” to guide it. However, this mirror is not a perfect one. It only works if the light rays in the fiber run almost parallel to the core, so if the fiber is bent too tightly (i.e., past the “critical angle”, where reflection turns to refraction), light will leak out of the fiber, causing loss – or attenuation.
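The critical angle mentioned above can be illustrated with a small calculation. The refractive indices below are typical orders of magnitude for silica fiber, assumed for illustration only:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle (measured from the normal to the core/cladding
    boundary). Rays striking the boundary at angles beyond this are
    totally reflected; shallower rays refract out and are lost."""
    return math.degrees(math.asin(n_clad / n_core))

# With a core index only slightly above the cladding index, the
# critical angle sits close to 90 degrees, so only rays running nearly
# parallel to the fiber axis stay guided:
print(round(critical_angle_deg(1.450, 1.445), 1))  # about 85 degrees
```

This is why a tight bend, which tilts the rays relative to the boundary, lets light escape.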
This is called macro bending, where the diameter of the bending is larger than a few millimeters, which is what one would intuitively understand as “bending” the fiber.
Another type of bending is called microbending. It concerns bending diameters smaller than 1 mm and could happen, for example, if a fiber were squeezed between two sheets of sandpaper. Much more relevantly, it may also happen if the fiber is squeezed inside the cable construction (for example by the tubes containing the fibers), creating stress on the fiber. As loads and stresses increase, so does the loss.
Both types of bending loss cause attenuation increase, but it is possible to tell the two types of bending apart by considering the added loss at different wavelengths as illustrated in Figure 8.
Macrobending losses tend to be small at short wavelengths, but may increase rather dramatically at longer wavelengths.
Microbending losses are also typically present at short wavelengths, but the loss increase tends to be smaller than for macrobending at the longer wavelengths.
All of the trends in fiber deployment point to the increased importance of fiber bending performance.
Service providers constantly want to put more fibers into a smaller space which means that while buffer tube diameters keep shrinking, the fiber counts used in these buffer tubes keep increasing. This leads to a situation where there is less room for fibers to move before touching a buffer tube wall, thereby increasing the risk of microbends.
In addition, service providers historically installed cables mainly in the outside plant, inside central offices, or in remote cabinets – environments where great care was taken to avoid small-diameter bends. Today, however, fiber is going places it hasn’t gone before: inside our homes and businesses, and up poles and onto rooftops to feed cellular and Wi-Fi sites.
Tolerance to bending will be even more important in the future.
Micro and macro bends affect the network in ways that are not always obvious.
Bend-related losses are sometimes experienced in cold temperature environments. For this reason, fibers and cables should always be tested under low temperature conditions. As a network designer, it’s always a good idea to account for at least some optical margin for small potential attenuation increases in cold temperatures.
Very high-density designs in particular may benefit from bend-insensitive fibers, due to the unavoidable bends and the lack of free space for fiber movement in the cable design itself.
While these issues are already important today, they will become even more important tomorrow. The reason is that next generation optical transmission protocols may typically use longer wavelengths than the existing protocols.
As highlighted earlier, longer wavelengths will often result in higher bending loss. Theoretically, a GPON network operating flawlessly today at 1490 nm – containing inadvertent bends – could have its reach reduced by almost half when it is upgraded to NG-PON2, operating at 1603 nm.
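The reach-reduction claim can be illustrated with a back-of-the-envelope model: for a fixed loss budget, reach is the budget divided by the per-km loss, and distributed bend loss that grows at the longer wavelength eats directly into that. All numbers below are hypothetical, chosen only to show the effect:

```python
def reach_km(power_budget_db: float, fiber_loss_db_per_km: float,
             bend_loss_db_per_km: float) -> float:
    """Maximum span length when the loss budget is consumed by
    intrinsic fiber loss plus distributed bend-induced loss."""
    return power_budget_db / (fiber_loss_db_per_km + bend_loss_db_per_km)

budget = 28.0  # dB, assumed link budget
# Hypothetical: inadvertent bends that add 0.05 dB/km at 1490 nm might
# add several times that at 1603 nm, where macrobend loss rises sharply.
print(round(reach_km(budget, 0.21, 0.05), 1))  # ~107.7 km reach at 1490 nm
print(round(reach_km(budget, 0.21, 0.30), 1))  # ~54.9 km at 1603 nm -- roughly halved
```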
So a FTTH network installed today and working fine may not be suited for operation with future generation transmission equipment.
HELP IS ON THE WAY
To enable more compact cable constructions, allow for easier installation, and perhaps even permit the use of less experienced craftspeople for cable installation, considerable attention has recently been focused on developing fibers with reduced sensitivity to bending, i.e., those defined by ITU-T Recommendation G.657.
G.657 specifies 4 different classes of fibers: “A1”, “A2”, “B2” and “B3”.
The “A” fibers are also required to comply with the ITU-T G.652.D recommendation, whereas the “B” fibers may deviate from G.652.D on some parameters. The number (1, 2 or 3) signifies the fiber’s tolerance to bending, with B3 fibers being the most bend-tolerant. Many “B3” fibers today comply with G.652.D and could rightfully be labelled “A3”, but no such class is specified by ITU-T.
ITU-T G.657.A1 fibers are the closest to standard G.652.D fibers and may soon be the primary choice for the vast majority of fiber networks. OFS has combined G.657.A1 and G.652.D performance with a 9.2-micron mode field diameter.
G.657.A1 fibers with a 9.2-micron mode field diameter splice the same way as standard G.652 fibers and can consequently be said to splice “seamlessly” to the huge base of already installed fibers. By offering the same splice performance as standard G.652 fibers, installation crews and quality inspectors will notice no change in performance and be given no cause for concern – even though the advantages of better bend tolerance will still be there.
These fibers are ideal for most of today’s typical short-distance (<1000 km), lower data rate (<400 Gbps) applications, including standard outside plant (OSP) loose tube, ribbon, rollable ribbon, microduct, and drop cables.
ITU-T G.657.A2 fibers can be bent more tightly with lower loss. They are most commonly used in central office and cabinet environments, such as Fiber Distribution Hubs (FDH). These fibers are also commonly used in building backbone networks and as tails for various pre-terminated panels and other devices. In these environments, the fibers may need to be bent more tightly than in typical OSP cable applications.
The application spaces just mentioned for A1 and A2 fibers typically involve one fiber carrying traffic for thousands of customers, meaning that a fiber break would affect service to thousands of users; reliability is consequently paramount. In such situations, A2 fibers (and A1 as well) offer the advantage of providing an “early warning” signal of increased attenuation whenever they are bent tightly enough to raise reliability concerns. This is especially important for central office applications, where one fiber could feed millions of customers.
ITU-T G.657.B3 fibers are the third main category of bend-insensitive fibers. They are designed and recommended for use in the drop portion of a Fiber-to-the-Home (FTTH) network, serving a few customers per fiber. Homes and buildings with lots of tight spaces are very demanding places to deploy fiber. For optimized performance in such applications, OFS has fiber designed and specified for use at bending radii as low as 2.5 mm, significantly less than the minimum bending radius of 5 mm specified in the G.657.B3 recommendation.
OFS has fibers used in cables with a diameter of only 0.6 mm, enabling almost invisible in-house cable routing with a minimum of bend management and avoiding bulky, unsightly installations in private homes. For more demanding deployments, ruggedized cable designs with diameters of 4.8 or 3 mm may even be routed around corners and stapled using fast and easy installation practices, with negligible signal loss.
G.657 fibers that are not compliant with G.652.D are often assumed to have very small cores, giving rise to significant additional splice loss when spliced to standard G.652.D fibers. However, that is not necessarily so. It is possible to get G.657.B3 fibers specified for an ultra-low bending radius of 2.5 mm that – while not “seamless” fibers – do in fact comply with the G.652.D recommendation in terms of core size. The only thing preventing such fibers from complying with G.652.D is chromatic dispersion, and since they are primarily intended for in-building applications, the fiber length will typically be far below the 10 – 40 km at which the higher chromatic dispersion may typically start presenting problems.
Regarding bending loss, however, the performance of such a fiber is dramatically better: the loss for a single turn at a 2.5 mm radius at 1550 nm is at most 0.2 dB, whereas the corresponding loss for a standard G.652.D fiber exceeds 30 dB.
for 50 and 62.5 µm GiHCS®, 200 µm HCS® LC Connectors
Important Safety and Warranty Information
Please Read First!
Please make sure to READ and understand the termination instructions completely. Improper assembly will cause poor termination results and may damage termination kit components.
Make sure you WEAR eye protection during the termination process. Bare optical fiber is sharp and may splinter; handle very carefully and make use of the provided fiber optic shard disposal container.
For more information please CONTACT the sales representative in your region or call the factory for technical support:
Mon-Friday, 8:00 am-5:00 pm EST.
770-798-5555 [Outside the USA and Canada]
LC Termination Kit Contents
Related Products and Accessories (Sold Separately)
LC and LC Duplex Connectors
Insertion Loss Test Kit
STEP 1: Install Strain Relief Boot
Slide the STRAIN RELIEF BOOT (tapered end first) onto the cable end and slide it approximately 3 inches [76 mm] out of the way.
STEP 2: Remove Outer Cable Jacket
Mark the cable outer jacket 2.5 inches [63.5 mm] from the cable end with a marker.
Using the 2nd hole (marked 1.6) from the open side of the cable jacket strip tool, remove the 2.5 inches [63.5 mm] of outer jacket.
STEP 3: Remove ETFE Buffer
Insert the buffered fiber through the guide tube of the ETFE Buffer Strip Tool, all the way in until the cable jacket bottoms out inside it.
Holding cable securely, squeeze the tool’s handles to cut ETFE buffer then PULL STRAIGHT to remove the ETFE buffer.
With alcohol prep pad folded in two, wipe the surface of fiber where the ETFE buffer was just removed.
STEP 4: Install Connector Body
Locate the CONNECTOR BODY subassembly in the CRIMP TOOL nest as shown. Close the crimp tool handles lightly to secure the connector in the nest, but do not yet apply the crimp.
Insert the stripped fiber into the CONNECTOR BODY subassembly until the cable jacket bottoms out inside the connector.
Squeeze the handles of the CRIMP TOOL to apply the crimp. The CRIMP TOOL will not release until fully crimped.
Remove the CONNECTOR from the CRIMP TOOL nest. Slide up and install the BOOT onto the CONNECTOR.
STEP 5: Cleave Optical Fiber
Holding the CLEAVE TOOL in a horizontal position, grip the handle while leaving your index finger free to actuate trigger
Gently insert CONNECTOR BODY into cleave tool as shown. Be sure to have it fully inserted and release the CONNECTOR BODY
Using your index finger, slowly depress the trigger to perform the cleave operation. The cleave process is complete when the optical fiber snaps away from the connector. Do not release the trigger just yet!
Before releasing the trigger, remove the CONNECTOR BODY from the cleave tool and grasp the optical fiber scrap while releasing the trigger.
Gently remove the scrap fiber from the cleave tool while keeping it away from the tool’s diamond blade. Place the scrap optical fiber into the fiber optic shard container for safe disposal.
STEP 6: Install Anti-snag Latch or Duplexing Clip
Anti-snag latch:
• Spread the latch slightly as shown.
• Install the latch around the connector, aligning as shown.
• Wrap around and snap on to secure.
Duplexing clip:
• Spread the clip slightly as shown.
• Install the clip around the connectors, aligning as shown.
• Wrap around and snap on to secure.
Cleave Tool Cleaning Guide
To clean your cleave tool, please order the OFS Cleave Tool Cleaning Kit (part #P16247), which includes the recommended cleaning fluid, swabs, and complete instructions.
There are three main loss mechanisms in fibers – scattering, absorption, and bending – and we’ll briefly discuss each.
The first mechanism is “Rayleigh scattering” of light in the fiber, which contributes most of the baseline attenuation. A certain amount of light is scattered in the glass. In the simplest terms, scattered light is light that is no longer guided through the optical fiber but instead propagates in some other, random direction (an interesting side note: OTDRs measure loss using the light that is scattered backwards in a fiber, which is why the device only needs to be connected to one end of the optical fiber). Since some light doesn’t transmit forward through the glass, loss occurs. The classic attenuation curve shows attenuation proportional to 1/λ⁴, driven by the properties of Rayleigh scattering. Rayleigh scattering is the result of small fluctuations of glass density in an optical fiber and is the same mechanism responsible for the blue color of the sky, where sunlight scatters off molecules in the atmosphere.
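The 1/λ⁴ relationship means that moving to longer wavelengths cuts scattering loss substantially. A short sketch, with wavelengths chosen to match the common transmission windows:

```python
def rayleigh_ratio(lambda_a_nm: float, lambda_b_nm: float) -> float:
    """Relative Rayleigh-scattering loss at wavelength a versus
    wavelength b, using the ~1/lambda^4 dependence."""
    return (lambda_b_nm / lambda_a_nm) ** 4

# Scattering loss at 1310 nm relative to 1550 nm:
print(round(rayleigh_ratio(1310, 1550), 2))  # 1.96: nearly double the scattering
```

This is one reason the 1550 nm window exhibits lower baseline attenuation than 1310 nm, before absorption takes over at still longer wavelengths.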
The scattering-related attenuation properties of the glass are determined by the materials used in the glass, and are frozen in during fiber manufacturing.
Impurities may absorb or reflect light. This is why fiber manufacturers pay such close attention to the quality of materials used in the glass and to cleanliness during manufacturing. Particles as small as a fraction of a micron can be large enough to absorb enough light to increase loss.
Besides particles, impurities from the fiber manufacturing process itself can increase loss. The hydroxyl (OH) ion, for example, is a by-product of the manufacturing process and absorbs light in the wavelength range around 1383 nm.
The graph to the right shows the loss performance across the wavelength range with three different grades of fiber.
Bending is a very important mechanism. The cabling process and installation in the field can affect attenuation caused by bending.
Let’s go back to Fiber 101. Fibers use the concept of total internal reflection to guide light. The refractive index profile of the core and the cladding determine how light is guided, and the term “critical angle” is used to describe when reflection turns to refraction and light is lost from the fiber. In short, when fiber is bent tightly, light can be lost.
There are two main modes of bending – macrobending and microbending.
While the end result of both types of bending is attenuation, the mechanisms and how they manifest differ.
Other Ways Microbends and Macrobends Show Up in the Network
The concepts of micro and macro bends affect the network in ways that are not always obvious.
Bend-related loss is also sometimes experienced in cold temperature environments. For this reason, fiber and cable qualifications should always include tests to see how products perform in cold temperatures. As a network designer, it’s always a good idea to account for at least some optical margin for small potential attenuation increases in cold temperatures.
Help is On the Way
The good news is that fiber manufacturers have developed fibers that can withstand different amounts of bending with much lower loss than traditional fibers meeting the ITU-T G.652.D recommendation. These fibers are called bend-insensitive or bend-optimized fibers and are defined by ITU-T Recommendation G.657.
For the network designer and installer, a thorough understanding of various attenuation mechanisms can assist with the network planning and installation processes, enabling proper loss budgeting and the use of appropriate products for the application.
In most situations, attenuation is the most important network parameter, and this article has provided enough background for you to be well on your way to fiber geekdom on this topic. However, fiber geekdom is a journey, not a destination, and there’s always more to learn. OFS has multiple decades of experience with fiber optic networks. Please contact your local OFS representative if you would like additional information regarding any of the items in this article.
Mark Boxer is Technical Manager, Solutions and Applications Engineering for OFS. In this role, he assists customers deploying fiber in a wide variety of network design scenarios around the world and analyzes trends in telecommunications markets that drive new product innovation. Mark has a BME degree from Georgia Tech, and has spent his 30+ year career in the fiber industry. His experience includes varied roles in manufacturing and applications engineering for fiber-based products and markets. Other activities include inventor of six US Patents, member and past Secretary of the IEEE Power Engineering Society Fiber Optic Working Group, contributing member to the Fiber Broadband Association (FBA) (formerly FTTH Council) Technology Committee and Board of Directors member of the FBA.
This document covers cable placing in conduit, innerduct, handholes, and manhole structures. The innerduct may be direct buried or placed in larger diameter conduits. In some applications, the innerduct may be lashed to an aerial strand.
This document covers conventional cable placing techniques that are used to pull or blow (cable jetting) the cable into the conduit or innerduct.
It is recommended that an outside plant engineer conduct a route survey and inspection prior to cable installation. Manholes and ducts should be inspected to determine the optimum splice locations and duct assignments. A detailed installation plan, including cable pulling or blowing locations, intermediate assist points, and cable feed locations should be developed based on the route survey.
2.2 Maximum Rated Cable Load
The maximum rated cable load (MRCL) for most OFS outside plant fiber optic cables is 600 lb; however, the cable documentation should always be checked because lower values of MRCL may apply for some cables. When using pulling equipment to install cable, measures should be taken to ensure that the MRCL is not exceeded. This includes the use of breakaway swivels, hydraulic pressure relief valves, and electronic tension control systems.
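A simple guard against exceeding the MRCL can be expressed in a few lines. This is our own illustrative sketch, not an OFS tool; the 600 lb default mirrors the figure above, and real installations should rely on calibrated tension-limited equipment:

```python
def tension_ok(measured_lb: float, mrcl_lb: float = 600.0) -> bool:
    """True if a measured pulling tension is within the maximum rated
    cable load (MRCL). Always check the cable documentation, since
    some cables carry a lower MRCL than the common 600 lb value."""
    return measured_lb <= mrcl_lb

print(tension_ok(450.0))  # True: within a 600 lb MRCL
print(tension_ok(650.0))  # False: exceeds a 600 lb MRCL
```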
2.3 Minimum Bend Diameter
The minimum bend diameters for OFS cables are defined for both dynamic and static conditions. The dynamic condition applies during installation when a cable may be exposed to the MRCL, e.g., while pulling the cable around a sheave or capstan.
2.4 Temperature Limits
Storage and installation of OFS fiber optic cable is limited to the temperature ranges specified for the cable. Be aware that solar heating due to sunlight exposure can increase the cable temperature well above the ambient temperature.
3. Underground Optical Cable Precautions
Before starting any underground cable placing operations, all personnel must be thoroughly familiar with local company safety practices. Practices covering the following procedures should be given special emphasis:
Fiber optic cable is most often placed in a small-diameter innerduct rather than a large-diameter conduit. For existing conduit structures, multiple innerducts can be placed in a single conduit to provide multiple cable paths in the duct. Innerduct is also recommended because it provides a clean continuous path for the installation of the fiber optic cable.
4.1 Diameter Ratio and Area Ratio
Diameter ratio and/or area ratio are used to determine the optimal cable OD that should be installed in an innerduct. Either ratio can be used, but consistently using one or the other is important to avoid confusion.
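Both ratios reduce to one-line calculations. A minimal sketch with hypothetical dimensions (the 16 mm cable OD and 27 mm duct bore are our assumptions, not values from this document):

```python
def diameter_ratio(cable_od_mm: float, duct_id_mm: float) -> float:
    """Cable outer diameter divided by the innerduct inner diameter."""
    return cable_od_mm / duct_id_mm

def area_ratio(cable_od_mm: float, duct_id_mm: float) -> float:
    """Cable cross-sectional area divided by the innerduct bore area;
    the pi/4 factors cancel, leaving the squared diameter ratio."""
    return (cable_od_mm / duct_id_mm) ** 2

print(round(diameter_ratio(16.0, 27.0), 2))  # 0.59
print(round(area_ratio(16.0, 27.0), 2))      # 0.35
```

Because the area ratio is the square of the diameter ratio, the two always rank candidate cables the same way; picking one and using it consistently, as the text advises, avoids confusion.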
4.2 Direct Buried Applications
Studies have shown that vertical undulations in direct buried innerduct can greatly increase the required cable installation forces.
5. Cable Lubricant
Cable lubricant should be used when placing fiber optic cables. Recommended cable lubricants include Polywater, Hydralube, and similar cable lubricants that are compatible with polyethylene cable jackets. Both the winch line (or pulling rope) and the cable should be lubricated.
6.1 Backfeed Technique
The backfeed technique is a common installation method that is used to divide the cable installation into two separate pulls. The backfeed technique may also be used near equipment offices when one end of the cable must be pulled by hand into the building, or at manhole locations where the cable route changes direction.
6.2 Forward-Feed Technique
In the forward-feed technique, the leading end of the cable and excess cable length are pulled out of the innerduct at an intermediate manhole and stored on the ground in a figure-eight. This technique can be used multiple times during a cable installation to greatly increase the distance between cable splices.
6.3 Figure-Eight Installation Techniques
If figure-eight techniques are used during cable installation, the cable should be handled manually and stored on the ground. Place the cable on tarps to prevent damage from gravel, rocks, or other abrasive surfaces.
When figure-eighting heavy cables (264 fibers or more), the cable stack should be offset to prevent sheath dents and cable damage. Although sheath dents do not typically damage fibers, this type of cosmetic damage is undesirable. When utilizing the offset method, each crossover point of the cable stack should be offset about 2 inches instead of being stacked directly on top of each other.
Handholes are frequently used to provide access to cable splices and slack storage coils. On long cable pulls, handholes may be used to facilitate intermediate-assist placing operations. The intermediate-assist handholes are typically installed near obstacles or at a predetermined spacing that coincides with the maximum expected cable installation length.
7. Pulling Fiber Optic Cable
The following instructions assume general familiarity with outside plant cable placing procedures. They also assume that the innerduct is in place and a lubricated pulling tape or rope has been installed in the innerduct.
7.1 Feed Manhole
Mount the cable reel on the reel carrier so that the cable feeds off the top of the reel. Position the cable reel adjacent to the manhole and in-line with the innerduct. The cable reel should be positioned close enough to the manhole so that excessive cable length is not dragging on the ground, but far enough away to maintain slack cable in the event of a sudden start or stop during the pulling operation. A distance of 10 – 15 feet is generally sufficient. Attach the pull line to the fiber optic cable using a cable grip and swivel connector. Caution: A breakaway swivel connector is required if a tension limited winch is not used to pull the cable.
7.2 Intermediate Manholes
The innerduct in intermediate manholes may be continuous through the manhole or it may be interrupted. In either case, the innerduct should be positioned in a straight path from entry duct to exit duct. If the innerduct is continuous and has been racked, remove the innerduct ties and straighten the innerduct through the manhole. If necessary, slack innerduct can be cut out using an innerduct cutter. Secure the innerduct in the manhole to prevent it from creeping into the main duct during cable placing operations.
7.3 Tension-Limited Winches
OFS recommends the use of tension-limited winches for pulling fiber optic cable. The tension control may be accomplished by electrical, mechanical, or hydraulic methods. In any case, the tension-limiting device must be routinely calibrated as recommended by the equipment manufacturer. Cable winches that display cable tension but do not have automatic cutoff are not sufficient to protect the cable. If a tension-limited winch is not used, a breakaway swivel must be used to connect the fiber optic cable to the pulling line.
7.4 Capstan Winches
Breakaway swivels do not protect the fiber optic cable after the cable pulling eye passes the intermediate capstan winch; therefore, intermediate-assist capstan winches must be tension-limited. The capstan diameter must also meet the cable's minimum bend-diameter requirement.
The capstan winches should be positioned along the cable route where the expected pulling tension will be 600 pounds or less. Proper positioning of the capstans prior to the start of the pull will eliminate construction delays caused when an unplanned intermediate capstan assist must be added to the placing operation.
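The 600-pound guideline can be turned into a rough pre-pull plan. The sketch below is a simplified model, not a calculation from this document: the cable weight, friction coefficient, and route geometry are assumed values, and it uses the standard straight-section and capstan-equation formulas to estimate where tension builds up along a duct route.

```python
import math

def straight_tension(t_in, weight_per_ft, friction, length_ft):
    """Tension added by dragging cable through a straight, level duct section."""
    return t_in + weight_per_ft * friction * length_ft

def bend_tension(t_in, friction, bend_angle_rad):
    """Tension multiplied by a duct bend (capstan equation)."""
    return t_in * math.exp(friction * bend_angle_rad)

# Assumed route: 1000 ft straight, one 90-degree bend, then 800 ft straight.
w, mu = 0.1, 0.5                 # lb/ft cable weight and friction (assumed)
t = straight_tension(0.0, w, mu, 1000)
t = bend_tension(t, mu, math.pi / 2)
t = straight_tension(t, w, mu, 800)
print(f"Estimated tension at pull end: {t:.0f} lb")
```

If such an estimate approaches the 600-pound guideline before the pull end, an intermediate capstan assist should be planned at the point where the limit would be reached.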
7.4.3 Slack Cable Loop
During the pulling operation, a slack loop of cable must be maintained on the pull-off side of the capstan as shown in the figure.
7.4.4 Adding Intermediate Capstans
If an intermediate capstan is added during the cable pull and the cable pulling eye has already passed through the manhole, a loop of slack cable must be pulled to the intermediate manhole before cable is wrapped on the capstan.
7.4.5 Removing Cable from Intermediate Capstan Assist Winch
At the conclusion of the pull, the cable on the capstan is twist free. However, one twist per wrap will be generated in the cable if it is removed from the capstan and straightened.
8. Blown Optical Cable Installation
Cable blowing systems use high-pressure, high-velocity airflow combined with a pushing force to install the cable. A hydraulic or pneumatic powered drive wheel or drive belt is used to push the cable into the innerduct at the feed manhole. Controls and gauges on the cable blowing system allow the operator to monitor and adjust the air flow and push force that is exerted on the cable. Some cable jetting systems use a plug at the cable end to capture the compressed air and generate a small pulling force on the end of the cable.
9. Optical Cable Coiling
9.1 Coils Stored at Intermediate Holes
Many end users require that coils of slack cable be stored in intermediate manholes along the cable route. These slack storage coils are used for future branch splices or route rearrangements. It is important that the coiling method accommodates the proper coil diameter and does not introduce kinking or excessive twist into the cable.
9.2 Fold-Over Method
The Fold-Over Method is recommended for storing moderate lengths of slack cable. Form a cable bight and then twist the bight to form the first cable coil. Fold the coil over to form the second cable coil.
9.3 Teardrop Method
The teardrop coiling method is recommended for storing longer lengths of cable since it is easier to roll the cable than perform repetitive folding operations. The cable is stored twist free by rolling the cable bight in a manner similar to that used on the cable end.
9.4 Garden Hose Method
The garden hose method is recommended for large diameter cables because only one turn of cable is handled at a time. The storage coil can be formed directly on the manhole racking as each additional loop is added. Each loop can be taped in place as it is added to the storage coil. This method can be used to store any length of slack cable.
9.5 Coils Stored at Splice Locations
Slack cable must be stored at splice locations to allow for splicing. Typically, a cable length of 50 to 100 feet is required for splicing purposes; however, the actual cable length may vary depending on the accessibility of the manhole.
10. Racking Fiber Optic Cable and Innerduct
Cable racking normally begins in intermediate manholes and proceeds manhole by manhole toward each end of the cable. Slack for racking the fiber optic cable may come from either the feed or the pull manhole depending on which end is closer and the amount of excess cable that is available. The preferred method of obtaining racking slack is pulling by hand. If the cable cannot be moved by hand, a split cable grip can be attached to the cable and the cable can be pulled using a cable winch or a chain hoist. Do not exceed the maximum tension rating or violate the minimum bending diameter of the cable while pulling the slack.
Cable coils should be racked in a location where they will not be damaged, preferably on the manhole wall behind in-place cables. Do not decrease the diameter of the cable coils. If slack cable must be removed from the coil for racking purposes, remove one or more loops from the coil and then enlarge the coil to absorb excess slack. Tie the coil securely in place with plastic ties.
A German manufacturer of stone-retrieval baskets works with a U.S. manufacturer of specialty optical fiber. The result is a basket that, thanks to its coaxially integrated optical fiber, can simplify and shorten minimally invasive urological surgery.
The treatment of kidney stones has changed dramatically over the years. Instead of open surgery, minimally invasive endoscopic procedures can now be used. Once a stone is found, it can usually be removed using a nitinol basket. If the stone is too far up the urinary tract, laser energy is used to break it up, delivered through an optical fiber introduced into the urinary tract. This procedure is called intracorporeal lithotripsy.
Laser settings for breaking up a stone can vary. Combining a long pulse duration with low pulse energy and high pulse frequency will blast the stone into dust; the small dust particles are eliminated naturally. However, the high pulse frequency will cause the ambient temperature to rise and may damage surrounding tissue. An alternative to pulverization is fragmentation, which uses laser energy with a short pulse duration, high pulse energy and low pulse frequency. The resulting fragments can then be captured using a stone-retrieval basket.
Usually the stone is fragmented prior to the pieces being captured by the basket. But sometimes, depending on the location of the stone, the reverse order is necessary. In these cases, where the stone is captured and then fragmented, there is the risk of the laser energy damaging the stone-retrieval basket as well as the surrounding tissue.
The next logical development in intracorporeal lithotripsy is an instrument that coaxially integrates an optical fiber with the stone-retrieval basket. This improved instrument enables positioning of the basket and the optical fiber at the same time. The stone is safely trapped and fragmented without damaging surrounding tissue or the basket. Surgery time is shortened since only one instrument is needed.
This new device was developed by Endosmart GmbH in Stutensee, Germany together with OFS, a U.S. designer and manufacturer of specialty optical fiber.
A typical laser system for lithotripsy is based on a Ho:YAG (Holmium:Yttrium-Aluminum-Garnet) laser operating at a wavelength of 2123 nm with an average power of 30 W. Pulse duration, peak power and frequency are adjusted according to the individual treatment. For example, the laser pulse could reach up to 18 kW peak power or 3.5 J pulse energy. To enable orientation of the instrument, the system also delivers a visible red or green pilot light.
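The quoted figures are related by simple pulse arithmetic: average power is pulse energy times repetition rate, and peak power is roughly pulse energy divided by pulse duration. The repetition-rate and pulse-duration values below are illustrative assumptions, not taken from the article.

```python
def average_power(pulse_energy_j, pulse_rate_hz):
    """Average power is pulse energy times repetition rate."""
    return pulse_energy_j * pulse_rate_hz

def peak_power(pulse_energy_j, pulse_duration_s):
    """Peak power is roughly pulse energy divided by pulse duration."""
    return pulse_energy_j / pulse_duration_s

# A 3.5 J pulse at an assumed rate chosen to stay within a 30 W average budget:
print(average_power(3.5, 8))       # 28.0 W average
# An 18 kW peak at 3.5 J implies a pulse duration near 194 microseconds:
print(3.5 / 18e3)
```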
Light is guided even under extreme bending
The step-index multimode optical fiber used to guide the laser can have a pure silica core and a fluorine-doped glass cladding or a Germanium-doped core with a pure silica cladding. The different refractive indices of core and cladding enable the laser to propagate longitudinally in the fiber core. For guiding the light under extreme bending, an additional UV cured fluoroacrylate coating is applied. The fluoroacrylate coating has a lower refractive index than either of the glass claddings and acts as a secondary cladding for guiding the light. The optical fiber that is used with the nitinol basket described above has a core diameter of 272 µm and a silica cladding diameter of 299 µm. Around that, a 330 µm UV cured fluoropolymer coating is applied acting as a second optical cladding and finally, an ETFE buffer of 400 µm is applied.
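The guiding condition described above depends only on the refractive-index contrast between each core/cladding pair. As a rough sketch, the numerical aperture of each guiding interface can be computed from the indices; the index values used here are assumed for illustration, since the article does not give them.

```python
import math

def numerical_aperture(n_core, n_clad):
    """NA of a step-index guide: sqrt(n_core^2 - n_clad^2)."""
    return math.sqrt(n_core**2 - n_clad**2)

# Assumed indices: silica core with fluorine-doped cladding (primary guide),
# and a lower-index fluoroacrylate coating acting as a secondary cladding.
print(numerical_aperture(1.457, 1.440))   # primary core/cladding NA
print(numerical_aperture(1.440, 1.376))   # cladding/coating NA (secondary guide)
```

The larger NA of the glass/coating interface is what keeps light guided even when tight bends cause it to escape the primary core.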
Glass fibers are also used for medical diagnostics. Current developments are focused on simultaneous diagnosis and treatment.
What is IEEE Std 802.3cm-2020, 400 Gb/s over Multimode Fiber?
The work of the IEEE P802.3cm Task Force was approved as a new standard by the IEEE-SA Standards Board on 30 January 2020, creating the latest 400 Gb/s Ethernet standard using multimode fiber. 400 Gb/s is the highest Ethernet speed, and 400 Gb/s optical modules are needed in hyperscale (Google, Microsoft, Alibaba, and others) and very large-scale enterprise datacenters. IEEE Std 802.3cm defines 400 Gb/s solutions over both 4-pair (400GBASE-SR4.2) and 8-pair (400GBASE-SR8) multimode links. The Task Force was chaired by Robert Lingle, Jr., Senior Director of Market Strategy at OFS.
400GBASE-SR4.2 is the first multimode standard to use two wavelengths (850nm and 910nm), enabling 100 Gb/s transmission over a single fiber pair. It takes advantage of the multi-wavelength capabilities of OFS LaserWave® WideBand (OM5) fiber with 150 meter link distances, while supporting 100 meter links over LaserWave FLEX 550 (OM4) fiber and 70 meter links over LaserWave FLEX 300 (OM3) fiber. This builds on well-established 40 and 100G BiDi and SWDM technology that has been offered by switch and transceiver suppliers over the past decade. A key motivation for the 400GBASE-SR4.2 transceiver type is support of the installed base of multimode fiber cabling, designed around 100 meter reach over OM4 MMF, as well as extended reach over OM5 MMF, especially in large enterprise datacenters. 400GBASE-SR8 uses eight pairs of multimode fiber, with each pair supporting 50 Gb/s transmission. It operates over a single wavelength (850nm). OM4 and OM5 will support 100 meter links, while OM3 can support up to 70 meters. A key motivation for 400GBASE-SR8 is support of new cabling architectures in hyperscale datacenters.
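The reach figures quoted above can be collected into a small lookup. The table data comes directly from the text; the helper function itself is just an illustrative convenience, not part of any standard.

```python
# Maximum link distances in meters, as stated in the text.
REACH_M = {
    "400GBASE-SR4.2": {"OM3": 70, "OM4": 100, "OM5": 150},
    "400GBASE-SR8":   {"OM3": 70, "OM4": 100, "OM5": 100},
}

def max_reach(phy, fiber_type):
    """Look up the maximum supported link distance for a PHY/fiber pairing."""
    return REACH_M[phy][fiber_type]

print(max_reach("400GBASE-SR4.2", "OM5"))  # 150
```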
What applications will use these links?
400 Gb/s multimode links can be used in a variety of applications. These include not only 400 Gb/s switch-to-switch (point-to-point) links, but several new applications, including 400GBASE-SR8 – 8x50GBASE-SR breakouts and 400 Gb/s shuffles (fig. 1). The breakout application minimizes the number of ports on the Top-of-Rack (ToR) switch, providing connectivity to higher numbers of servers from a single switch. In similar fashion, the shuffle application allows a single 400 Gb/s switch port to support 100 Gb/s links to 4 different switches. 400GBASE-SR8 supports both flexibility and higher density: a 400G-SR8 OSFP/QSFP-DD transceiver can be used as 400GBASE-SR8, 2x200GBASE-SR4, 4x100GBASE-SR2, or 8x50GBASE-SR. 400GBASE-SR8 is already being deployed as 2x200GBASE-SR4.
Fig. 1: Example 400GBASE-SR8 – 8×50 Gb/s breakout.
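As a sanity check on the breakout options just listed, each configuration of a 400GBASE-SR8 port should total 400 Gb/s. This check is illustrative only; the configuration names come from the text above.

```python
# (links, per-link rate in Gb/s) for each breakout mode of a 400G-SR8 port.
BREAKOUTS = {
    "400GBASE-SR8":   (1, 400),
    "2x200GBASE-SR4": (2, 200),
    "4x100GBASE-SR2": (4, 100),
    "8x50GBASE-SR":   (8, 50),
}

for name, (links, rate_gbps) in BREAKOUTS.items():
    assert links * rate_gbps == 400      # every mode fills the 400 Gb/s port
    print(f"{name}: {links} x {rate_gbps} Gb/s = 400 Gb/s")
```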
What does IEEE Std 802.3cm mean for hyperscale data centers?
Both 400GBASE-SR4.2 and 400GBASE-SR8 applications can be used for point-to-point 400 Gb/s links between switches. Additionally, new applications are being deployed in hyperscale data centers. As server speeds reach 50 and 100 Gb/s, racks will contain fewer servers, leading to a change in switch architecture away from ToR to Middle-of-Row (MoR) or End-of-Row (EoR) switches. Copper DAC links are reaching link distance and bandwidth limitations that will make it very difficult to support this change in architecture, leading to demand for a low cost, short reach optical solution. 400GBASE-SR8 provides support for eight 50 Gb/s server links from a single MoR or EoR switch port, significantly increasing bandwidth density on the switch faceplate.
What does IEEE Std 802.3cm mean for enterprise data centers?
400GBASE-SR4.2 is the first 400 Gb/s standard that takes advantage of the 4-pair OM3/OM4/OM5 infrastructure many enterprises installed earlier, first for 40 Gb/s Ethernet and later for 100 Gb/s 100GBASE-SR4 and 200 Gb/s 200GBASE-SR4. It provides a graceful evolution path for enterprise networks, using the same cable infrastructure through at least four Ethernet generations. Future advances point toward the ability to support even higher data rates as they become needed. It takes advantage of the latest multimode fiber technology, OM5 fiber, using multiple wavelengths to transmit 100 Gb/s over a pair of fibers over 150 meters, compared to 100 meters for OM4 and 70 meters over OM3.
400GBASE-SR8 will be used in enterprise data centers as 50 Gb/s servers are deployed. Most enterprise datacenter servers, however, operate at lower data rates, with 10 Gb/s server links still quite common.
What is coming next?
IEEE has already created a study group to investigate the development of 100 Gb/s per wavelength multimode solutions, known as the "100 Gb/s Wavelength Short Reach PHYs Study Group." This will enable support of next-generation 100 Gb/s server ports, expected in 2021-2022. By providing native 100 Gb/s support, no expensive "gearboxes" will be required to combine 50 Gb/s lanes, providing a low cost, power efficient optical solution. Beyond the 2021-2022 timeframe, once an 800 Gb/s Ethernet MAC is standardized, using this technology with two-wavelength operation could create an 800 Gb/s, four-pair link, while a single wavelength could support an 800 Gb/s eight-pair link.
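The closing projection is simple lane arithmetic: link capacity is the per-wavelength rate times the number of wavelengths times the number of fiber pairs. Using the 100 Gb/s per wavelength figure from the text:

```python
def link_capacity_gbps(per_wavelength_gbps, wavelengths, fiber_pairs):
    """Aggregate capacity of a parallel multimode link."""
    return per_wavelength_gbps * wavelengths * fiber_pairs

print(link_capacity_gbps(100, 2, 4))  # two wavelengths over four pairs: 800
print(link_capacity_gbps(100, 1, 8))  # one wavelength over eight pairs: 800
```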
Mode field diameter (MFD) is another specification related to fiber geometry. In a typical G.652.D compliant single-mode optical fiber, not all of the light travels in the core; in fact, a small amount of light travels in the fiber cladding. The term MFD is a measure of the diameter of the optical power density distribution, which is the diameter in which 95% of the power resides.
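Because the mode field defines where the optical power actually travels, a mismatch in MFD between two spliced fibers causes loss even when the claddings and cores are perfectly aligned. A common Gaussian-mode approximation for that loss depends only on the ratio of the two mode field radii. This is a standard textbook estimate, not a formula quoted in this article, and the MFD values below are illustrative.

```python
import math

def mfd_mismatch_loss_db(mfd1_um, mfd2_um):
    """Gaussian-mode splice loss estimate for an MFD mismatch (lossless otherwise)."""
    w1, w2 = mfd1_um / 2, mfd2_um / 2   # mode field radii
    return -20 * math.log10(2 * w1 * w2 / (w1**2 + w2**2))

print(round(mfd_mismatch_loss_db(9.2, 9.2), 3))   # identical fibers: ~0 dB
print(round(mfd_mismatch_loss_db(9.2, 10.4), 3))  # a modest MFD mismatch
```

Even a fairly large MFD difference contributes only a few hundredths of a dB, which is why MFD tolerances are specified but are rarely the dominant splice-loss term.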
Clad non-circularity measures a fiber's deviation from a perfectly round cross-section and is expressed as a percentage deviation from a perfect circle.
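As a minimal sketch of the usual calculation, non-circularity is the difference between the maximum and minimum cladding diameters divided by the average diameter. The measurement values below are hypothetical, not from the article.

```python
def clad_noncircularity_pct(d_max_um, d_min_um):
    """Cladding non-circularity as a percentage of the average diameter."""
    d_avg = (d_max_um + d_min_um) / 2
    return (d_max_um - d_min_um) / d_avg * 100

# Hypothetical measurement on a nominal 125 um cladding:
print(round(clad_noncircularity_pct(125.4, 124.6), 2))  # ~0.64 %
```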
Concentricity error (offset) of ≤ 0.5 µm, < 0.2 µm typically
Core/clad concentricity error (CCCE) measures how well the core is centered in the fiber. CCCE is measured in microns and, of course, the closer the core is placed to perfect center, the better it is.
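CCCE matters because splicing machines that align on the cladding inherit any core offset directly as lateral misalignment. A common Gaussian-mode estimate puts the resulting loss at about 4.343·(d/w)² dB, where d is the lateral offset and w the mode field radius. This is a standard approximation, not a formula from this article, and the numbers below are illustrative.

```python
def offset_loss_db(offset_um, mfd_um):
    """Gaussian-mode splice loss estimate for a lateral core offset."""
    w = mfd_um / 2                       # mode field radius
    return 4.343 * (offset_um / w) ** 2

# Assumed 9.2 um MFD; offsets reflect a typical and a worst-case CCCE.
print(round(offset_loss_db(0.2, 9.2), 3))  # typical 0.2 um offset
print(round(offset_loss_db(1.0, 9.2), 3))  # 1.0 um worst-case offset
```

The quadratic dependence on offset is why tightening CCCE from 1.0 µm toward 0.2 µm reduces the worst-case contribution by far more than a factor of five.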
Although the difference between 200 and 250 µm is not tremendously large, smaller diameter fibers can enable twice the fiber count in the same size buffer tube, while also still preserving long-term reliability.
Fiber curl assesses the non-linearity of bare glass. In other words, fiber curl measures how straight the glass fiber is when no external stressors are present. If imbalanced stresses are frozen into a fiber during the draw process, curl can result. This curl can show up during the splicing of fiber optic ribbons or when fixed V-groove splicing machines are used.
In closing, fiber geekdom is a journey, not a destination, and there’s always more to learn. OFS has multiple decades of experience with fiber-optic cable networks. Please contact your local OFS representative if you would like additional information regarding optical fiber geometry specifications.