Goodbye, CRT-IEEE Spectrum

2021-11-12 10:11:10 By : Ms. Kiana Qiu


The next TV you buy will not contain a cathode ray tube. With the world about to switch from analog to digital television, you may soon be shopping for a new set, along with more than 1 billion other consumers around the world.

With the United States going all-digital in 2009 and other developed countries following close behind, the entire television landscape will be transformed over the next few years. What technology will dominate this new television world? It will be flat, that's for sure. All the new TV technologies vying to replace the bulky, century-old CRT are sleek and slim.

But will it be plasma, liquid crystal, or one of several newer technologies that haven't yet reached store shelves? Will they all survive, or do some carry inherent technical weaknesses that will doom them within the next five or six years? I'll tell you where my money is, but first let's look at all the horses in this race.

Today, a plasma TV is what many consumers want to hang on the wall; an LCD TV is what they can afford. Both are flat-screen systems whose image quality, when displaying high-definition video, rivals that of film. Prices of both have dropped sharply in the past few years: a 42-inch (diagonal) plasma set including the tuner now sells for about $2,000, down from $5,000 four years ago, while a 37-inch LCD set, the more popular size for that technology, goes for roughly $1,200.

Nearly 8 million plasma TVs will be produced worldwide this year, almost double the 2005 output, according to Gartner Dataquest, a division of Gartner headquartered in Stamford, Conn. Gartner also estimates that by year's end, LCD manufacturers will have turned out nearly 42 million of their popular TV monitors. Most of these devices, plasma and LCD alike, are produced in Japan, South Korea, and Taiwan.

Both plasma and LCD TVs offer much larger screens than CRTs, but plasma screens can be truly huge: the largest available today measures 102 inches diagonally. Plasma images are also quite bright and can be viewed clearly from almost any angle. Today's LCDs have wider viewing angles than earlier models, but still narrower than those of competing technologies.

Nevertheless, a plasma TV will not be the last TV you buy. Here's why: it has a limited lifespan, it guzzles electricity, and it is heavy. Like CRTs, plasma displays use red, green, and blue phosphors, but instead of striking the phosphors directly with electron beams as a CRT does, a plasma display charges cells of xenon and neon gas trapped in a honeycomb between two glass panels.

Essentially, each plasma display contains about a million miniature fluorescent tubes, divided evenly among red, green, and blue. When the charged, or ionized, gas releases ultraviolet photons, they strike the phosphors, which emit the colored light that forms the television picture.

The lifetime problem stems from the fact that the luminous efficiency of the phosphor coating declines over time; that is, the phosphor emits less and less light each time photons excite it. The problem is far worse in a plasma set than in a CRT because plasma's phosphors live in a harsher environment: a CRT's electron beam is much gentler on phosphors than plasma's hot gas. In a plasma display, the contrast ratio (the difference between lit and unlit picture elements) falls rapidly under normal use, dropping by as much as 50 percent in four to five years. At that point, the picture looks visibly washed out.

Manufacturers today claim 60,000 hours of use before brightness drops by half, a figure extrapolated from hundreds of hours of testing. But contrast matters more than brightness. A recent test by IDC, a market research firm in Framingham, Mass., showed that after four weeks of typical use, a plasma TV's black level had already degraded by 13 percent; at that rate, five years of use could leave blacks rendered as light gray.
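A rough back-of-the-envelope model shows how quickly such a decline compounds. This sketch assumes, purely for illustration, that IDC's 13-percent-per-four-weeks figure compounds at a constant rate; real phosphor aging need not be exponential:

```python
# Illustrative model of plasma black-level decay, assuming the 13%
# drop per 4 weeks reported by IDC compounds at a constant rate.
# (An assumption for illustration; actual aging curves differ.)

def black_level_retained(weeks: float, drop_per_period: float = 0.13,
                         period_weeks: float = 4.0) -> float:
    """Fraction of the original black-level contrast remaining."""
    return (1.0 - drop_per_period) ** (weeks / period_weeks)

five_years = 5 * 52  # in weeks
print(f"After 4 weeks: {black_level_retained(4):.2f}")   # 0.87
print(f"After 5 years: {black_level_retained(five_years):.4f}")
```

Under this (pessimistic) compounding assumption, essentially no black-level headroom survives five years, consistent with the article's warning that blacks end up looking gray.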

Plasma displays are also power hogs. Although manufacturers have cut the power consumption of typical plasma sets by 30 percent over the past five years, these devices still draw more power than comparable LCD TVs, especially when displaying white or light-colored scenes. That power consumption generates heat, and if the set is not properly cooled, the buildup can damage components. Before you buy a plasma TV, consider this: this spring, Philips sent repair technicians to 12,000 American homes to replace components in plasma TVs that might have overheated.

Plasma displays are also heavier than their flat-panel competitors. Because the glass panels enclosing the gas must be much thicker, a 40-inch plasma set weighs 43 kg, while an LCD set of the same size weighs only 25 kg. Plasma technology demands such thick glass because the gas gets very hot; thinner glass would melt.

There are other problems, such as burn-in. This is a particular headache now because, with 1,000-channel cable and satellite services, TV networks feel a pressing need to identify themselves, usually with a static channel logo at the bottom of the screen. Again because plasma technology is harder on phosphors than CRT technology is, burn-in appears faster and more prominently on plasma TVs. Manufacturers have done much to address the problem; on new plasma sets, the ghost image fades after about 12 hours of use without a static image. But it remains a drawback.

As if these problems weren't enough, plasma sets do not work well at high altitudes, or indeed anywhere the ambient pressure differs from the internal gas pressure. When such a difference exists, the TV's power supply must work harder to keep the gas ionized.

To be sure, plasma manufacturers have been working hard on the technology's shortcomings. They have developed longer-lived phosphors and made considerable progress in controlling light leakage between cells, producing deeper blacks. But plasma sets with these improvements cost significantly more than competing LCD sets. In any case, most ordinary consumers buy the cheaper sets, in which the problems persist.

So what about LCD, today's most obvious alternative to plasma? An LCD TV is essentially a sandwich of many layers: a bright white backlight, a layer of liquid-crystal molecules, a thin-film-transistor matrix, two sheets of polarizing glass, and color filters. The transistors control the voltage applied to the three groups of liquid-crystal molecules that make up each pixel.

When the voltage is on, it twists the molecules, letting light pass through the glass layers and color filters; when it is off, the molecules untwist and block the light. Each pixel consists of liquid-crystal molecules sitting above red, green, and blue filters. Switching the appropriate molecules on and off produces countless combinations of red, green, and blue light, yielding a palette that spans the range of human vision.
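To get a sense of what "countless combinations" means in practice, here is a quick sketch assuming 8 bits of drive per color channel, a common consumer-panel figure that the article itself does not specify:

```python
# With N distinguishable brightness levels per subpixel, a pixel can
# show N**3 colors. 8 bits per channel (256 levels) is assumed here;
# the article does not state the panel's actual bit depth.
levels_per_channel = 256          # assumed 8-bit drive
colors = levels_per_channel ** 3  # red x green x blue combinations
print(f"{colors:,} distinct colors")  # 16,777,216
```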

Because of this multilayer structure, LCD TVs have relatively poor contrast (the difference between the brightest white and the darkest black the screen can display). The white backlight illuminating the display is usually a cold-cathode fluorescent tube, which works on the same principle as the venerable, ubiquitous neon sign. The light takes a tortuous path through the many layers before reaching the viewer's eye, and each layer absorbs some of it, cutting contrast and brightness.

LCDs are basically reliable; the life-limiting component is the backlight. Fluorescent tubes age: after about five years of normal household use, the tubes begin to dim and their color temperature starts to drift, meaning the hue of the emitted light shifts from pure white toward the red end of the spectrum. The transition is gradual; viewers usually don't notice the change until it becomes extreme. Because of this tube aging, the average service life of an LCD TV is about 7 to 10 years, close to that of an ordinary CRT TV.

To extend the lifespan of LCD TVs, some manufacturers have recently begun using high-intensity light-emitting diodes (LEDs) as backlights. They aren't cheap, though prices will fall as manufacturing volumes rise. Samsung sells a 46-inch model for about $9,000; Sony sells a similar set for about $12,000. In these models, an array of red, green, and blue LEDs, rather than fluorescent lamps, produces what appears to be white light.

Besides extending the LCD's life, LED-based lighting increases the display's color saturation. Saturation is essentially the purity of a color or, more precisely, the relative bandwidth of the light. Light emitted in a narrower bandwidth is more saturated; light spread over a wider bandwidth looks washed out.

For example, when the color filter on an LED-lit display removes blue and green to show a red pixel, the resulting red retains the single red frequency originally emitted by the red LED. The same filtering on a fluorescent-backlit display passes a wider range of red frequencies, producing a less saturated color.

Saturation matters especially for large HD LCD panels above 37 inches, because image-quality flaws are more obvious on bigger screens. Greater saturation allows finer color gradations, making pictures look strikingly vivid, an effect especially eye-catching in landscapes. A surfer on a red board popping out of a vast blue ocean: images like that might even get you hooked on the Travel Channel.

But even LEDs don't last forever. After about 60,000 hours of use, degradation begins to show; for most people, that means about 15 years. Although these LED/LCD TVs contain sensors that measure and adjust hue as the diodes age, the LEDs still fade, and the panel will dim noticeably after about ten years of moderate use. They also consume roughly twice the power of conventional fluorescent-backlit LCDs: a 42-inch LCD with LED backlight draws about 250 to 300 watts, only slightly less than a plasma panel.
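It's worth converting that 60,000-hour figure into calendar years. A minimal sketch, assuming a fixed number of viewing hours per day (the article's "about 15 years" corresponds to roughly 11 hours a day):

```python
# Convert the quoted 60,000-hour LED lifetime into calendar years for
# a few assumed daily-viewing figures. The viewing-hour values are
# assumptions for illustration, not figures from the article.
LIFETIME_HOURS = 60_000

def years_of_service(hours_per_day: float) -> float:
    return LIFETIME_HOURS / (hours_per_day * 365)

for h in (4, 8, 11):
    print(f"{h:>2} h/day -> {years_of_service(h):.1f} years")
```

At 11 hours of viewing per day the lifetime works out to just under 15 years, matching the article's figure; lighter use stretches it considerably further.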

Soon, in about two years, a third horse will join this race. The surface-conduction electron-emitter display (SED) is just emerging as a strong contender to replace the CRT. SED is an alternative flat-panel display technology being developed by Canon and Toshiba.

In an SED [see chart, "SED Science"], each pixel of the display is effectively its own tiny cathode ray tube. The cathode is a film of palladium oxide, chosen because it is conductive and very durable, resisting oxidation and corrosion even at high temperatures. Just as in a CRT, electrons emitted from the cathode strike phosphors: small dots of metal or rare-earth compounds that emit red, green, or blue light when energized.

The result is a flat-panel display that uses less energy than a plasma screen but delivers image quality close to that of a CRT, still the benchmark for all displays. Its power consumption is lower than plasma's for the same reason a CRT's is: generating an electron beam takes far less energy than ionizing a gas to excite photons.

SED is a variant of field-emission display technology [see "Watching the Nanotube," IEEE Spectrum, September 2003]. The main difference is that an SED's cathode uses a palladium oxide film rather than the tapered bundles of carbon nanotubes used in field-emission displays. (So far, nanotube-based field-emission displays have proven difficult to manufacture, mainly because the nanotubes themselves are hard to make.) SEDs have a theoretical manufacturing advantage: they can be printed with industrial inkjet printers not unlike the one in your home or office.

Last year, Toshiba and Canon began trial production of surface-conduction displays in the 40- to 50-inch range. Despite the theoretical manufacturing advantages, the companies seem to be having trouble raising output to commercially viable levels. They say they will begin mass production next July and ship SED TVs to Japanese retailers later in 2007.

The first SED TVs will probably retail for about 50 percent more than comparable plasma TVs. Whether they will suffer any long-term reliability or performance problems is not yet clear. Next year we should know whether surface-conduction technology has a real chance to carve out a significant niche; if Canon and Toshiba cannot produce large, reliable sets at realistic retail prices, the market will pass the technology by.

So which technologies will dominate over the next four to five years? There will be two winners, one for screens under 50 inches and one for screens over 50 inches. Fifty is a magic number in the television industry because, at least today, it is the upper limit for economically producing reliable panels with electronics deposited on glass substrates. Not coincidentally, it is also the smallest projection-TV screen size in mass production today.

For screens below 50 inches, which account for the vast majority of global sales, the two most attractive technologies will be LCD and, conceivably, SED. But SED prices are unlikely to fall significantly before 2010, so the near-term winner will be LCD.

LCD technology will dominate not only the category of 37-inch and smaller TVs but also the midsize market of 40- to 50-inch sets. The low cost and long-term reliability of LCDs will make them a better value than SED or plasma displays.

For consumers seeking the biggest screens, above 50 inches, projection TVs will be the best choice for the near future. As plasma, LCD, and surface-conduction displays grow in size, yields fall and costs rise. A production line that turns out four 42-inch displays at once can produce only one 100-inch display, and large displays are more prone to defects, cutting yields further. Even so, 103-inch plasma screens will go on sale soon, as will 65-inch LCDs. But the prices will be absurd: about US $70,000 and $15,000, respectively.

Projection TV has gotten much better lately. If you cringe at the memory of the ghostly, washed-out images of projection TVs 20 years ago, you're in for a surprise. Today's projection technologies include Digital Light Processing (DLP) from Texas Instruments, as well as the micro-LCD and liquid-crystal-on-silicon (LCOS) systems used in TVs from companies such as HP, JVC, Mitsubishi, RCA, Sony, and Samsung.

Of the 10 million projection TVs Gartner estimates will be produced this year, about 25 percent will be based on DLP, 9 percent on LCOS, and 66 percent on micro-LCD. All three technologies can deliver pictures brighter and sharper than what audiences see in conventional movie theaters today. Many models also display deeper blacks, and correspondingly higher contrast, than most LCD or plasma displays.

An HD DLP system contains an array of more than 2 million hinge-mounted micromirrors, each roughly 20 micrometers square. A bright white light shines on the array, and each mirror tilts to reflect light toward the screen or away from it, brightening or darkening an individual pixel. Controlled by microelectrodes, the mirrors can switch thousands of times per second.

The system coordinates the switching with a single spinning color wheel, usually about 7 cm in diameter, allowing the mirrors to create the red, green, and blue components of each of the millions of pixels in the TV image. To keep up with the National Television System Committee (NTSC) video rate of 30 frames per second, the wheel must move fast and precisely: first-generation color wheels, with three color segments, spun at about 3,600 revolutions per minute. Today's wheels have seven segments (two each of red, green, and blue, plus one white) and spin at about 7,200 rpm [see illustration, "Mirror, Mirror"].
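The wheel-speed arithmetic is easy to check. A short sketch relating the quoted 7,200 rpm to the 30-frame-per-second NTSC rate:

```python
# Relate DLP color-wheel speed to the NTSC frame rate. All input
# figures are from the article; the per-frame counts are derived.
RPM = 7200       # modern 7-segment color wheel
FPS = 30         # NTSC frame rate
SEGMENTS = 7     # 2x red, 2x green, 2x blue, 1x white

revs_per_second = RPM / 60                    # 120 revolutions/s
revs_per_frame = revs_per_second / FPS        # 4 revolutions per frame
fields_per_frame = revs_per_frame * SEGMENTS  # 28 color fields/frame
print(revs_per_frame, fields_per_frame)
```

In other words, the wheel completes four full revolutions per video frame, flashing 28 color fields past the mirror array, fast enough for the eye to fuse them into a single full-color image.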

LCOS displays also create TV images by redirecting reflected light, but they use liquid crystals instead of micromechanical mirrors. The liquid crystals coat a reflective surface, typically a silicon chip about 15 millimeters square, and change their orientation to block light or let it reach the reflective surface. In a single-chip LCOS system, a color wheel or LED array illuminates the chip; in multichip LCOS systems, three separate chips, one per primary color, are optically combined to produce the visible image.

The third contending projection technology, micro-LCD, uses three transparent LCDs, one each for the red, green, and blue components of a full-color image. Each LCD measures 18 to 33 mm diagonally, depending on the manufacturer and model. Mirrors split the light from a metal halide lamp into red, green, and blue beams, each of which passes through the corresponding LCD. The three beams then enter a prism, which recombines them into a single beam forming the full-color image.

Each of these technologies has minor drawbacks. Over time, the heat of the high-intensity metal halide projection lamp degrades the liquid-crystal coating in micro-LCD panels, discoloring the picture. The spinning color wheel of single-chip DLP and LCOS systems can create a rainbow effect for some viewers, because these systems rely on the human visual system's brief retention of an image after it disappears to fuse the sequential red, green, and blue images into one. Some people's eyes handle this better than others. The rainbow is most obvious in high-contrast scenes, such as a candle against a black background; in a football game full of motion and detail, it is barely noticeable.

All projection systems have a major problem: the light bulb.

Projection systems often use metal halide projection lamps because they are very bright and provide consistent color levels and brightness during their lifetime. These lamps emit light by passing an arc through a high-pressure mixture of argon, mercury, and various metal halide gases. The precise mixing of halides affects the properties of the light produced, affecting the associated color temperature and spectral intensity (for example, making the light bluer or redder).

The argon in the lamp ionizes easily, striking an arc between the two electrodes. Heat from the arc vaporizes the mercury and metal halides, which emit light as temperature and pressure rise. Roughly 24 percent of the energy a metal halide lamp consumes emerges as light, making these lamps generally more efficient than fluorescent lamps and much more efficient than incandescent bulbs such as halogens.

But these lamps last only 1,000 to 2,000 hours, and replacements are not cheap, at $300 to $400 each. Longer-lasting alternatives exist, such as the Ultra High Performance (UHP) lamps invented by Philips, which strike their arcs in nearly pure, high-pressure mercury vapor. The arc gap can be much smaller than in other lamp technologies, as small as 1.0 to 1.3 mm.

The smaller the gap, the higher the efficiency: a 100-watt UHP lamp in a projector can deliver more light to the screen than a 250-watt metal halide lamp. UHP lamps range from 100 to 200 W and last 3,000 to 10,000 hours; they are now available in video projectors and rear-projection TVs from all major manufacturers.

But like LCD manufacturers, projection-TV makers are moving toward replacing lamps with high-intensity LEDs, which may become widespread in the next three to four years. The LEDs will not be cheap, but they should last tens of thousands of hours, making the projection system's maintenance and operating costs comparable to those of other available TV types.

Projection TVs are also slimmer than they used to be: with microdisplays replacing the bulky picture-tube technology of old, the cabinets housing the projection optics and electronics are far lighter than their predecessors of 20 years ago. Today's 50-inch projection TVs average only 0.43 meters deep.

By 2010, LCD TVs will dominate in number, but mainly due to smaller screen sizes. Gartner predicts that nearly 90 million LCD TVs will be produced worldwide in that year alone, and retail sales will reach approximately US$30 billion. Projection TV output will grow steadily, and 14 million units will be produced in 2010. DLP technology will occupy the largest share of the projection TV market at 47%, compared with 35% for LCOS and 18% for micro LCD.

Meanwhile, plasma technology will slowly die out, a victim of economics rather than of its shortcomings: manufacturers today are investing far more in LCD production than in plasma, and still other technologies are emerging from the labs. Makers such as Panasonic and Pioneer, which have poured billions of dollars into plasma fabrication plants over the past five years, will either become niche players or switch to another technology, most likely LCD. Then again, if Canon and Toshiba's SED technology really takes off, Panasonic and Pioneer may join that bandwagon instead.

Until around 2015, you'll be content with LCD TVs in the bedroom and kitchen and a projection TV in the family room. By then, the next wave of display breakthroughs (organic LEDs, anyone?) [see sidebar, "Now on Small Screens"] will carry paper-thin TVs out of the laboratory and into the market, and a new battle for TV-technology dominance will once again send you TV shopping.

PAUL O'DONOVAN is the principal research analyst at Gartner Dataquest, headquartered in Egham, England, responsible for semiconductors and consumer electronics. Before joining Gartner 10 years ago, O'Donovan worked as a marketing engineer for 12 years at National Semiconductor in Santa Clara, California.


Chinese tech giants are struggling to gain autonomy from U.S. chipsets

Craig S. Smith is a former New York Times reporter and host of the podcast Eye on AI.

View of Yitian 710, an ARM-based server processor developed by Alibaba.

With Alibaba's announcement of a server chip designed on 5-nanometer technology and based on Arm Ltd.'s latest instruction-set architecture, China has taken another step toward semiconductor independence.

Yet despite that impressive feat, a more important chip-design development by the Chinese technology giant may be its release of the source code for a RISC-V CPU core designed by its own engineers, which other companies can now use in their own processor designs, avoiding architecture licensing fees. (The company made both announcements at its annual cloud conference in its hometown of Hangzhou last month.)

The Chinese government is funding many startups designing chips of all kinds. In the first five months of 2021, the number of newly registered Chinese chip-related companies more than tripled compared with the same period a year earlier. China's biggest technology companies, including Alibaba, Baidu, and Huawei, are developing their own chips rather than relying on parts from Intel, Nvidia, and other American companies.

"Flagship technology companies like Alibaba can help start the semiconductor industry by manufacturing very advanced chips," said semiconductor consultant Linley Gwennap.

China is intent on achieving semiconductor independence in both the design and the manufacture of the most advanced chips. The urgency stems from U.S. sanctions on Chinese telecom giant Huawei, which have made it impossible for the company to use foreign-made chips: the sanctions apply to any Huawei supplier that uses U.S. parts or technology.

The United States, shocked by China's moves to bring Taiwan under its control after letting most of its semiconductor manufacturing migrate abroad, has embarked on an ambitious plan to "reshore" its chip-making industry. About 80 percent of the world's semiconductor production capacity is in Asia, and almost all of the most advanced logic-chip production is in Taiwan. No Chinese semiconductor foundry has yet reached the 5-nanometer process needed to manufacture Alibaba's new Arm-based chip, so the company still relies on Taiwanese manufacturing.

In the long run, though, Alibaba's embrace of both the Arm and RISC-V instruction-set architectures may matter more. The instruction-set architecture, or ISA, is the language through which software talks to hardware, so it determines what software can run on a given chip. Most servers use CPUs based on Intel's x86 ISA, but UK-based Arm licenses its ISA to chip designers and has gained a firm foothold in the market.

The RISC-V instruction-set architecture comes with fewer strings attached. RISC-V is the fifth generation of an open-source reduced-instruction-set computer architecture created by American researchers. It is free to use and therefore insulated from geopolitical crosswinds.

China has two industry groups promoting RISC-V: the China Open Instruction Ecosystem Alliance and the China RISC-V Industry Alliance. In June, China hosted the fourth annual RISC-V summit, where industry, academia, and government gathered to discuss the architecture's future.

After the U.S. sanctions also barred Huawei from using Google's Android operating system, Huawei released its first RISC-V development platform to help engineers build for its own Harmony operating system on smartphones, IoT gadgets, and other so-called edge devices. Unable to buy Intel chips because of the sanctions, Huawei recently sold its x86 server unit to a company in China's Henan Province.

Alibaba launched its first RISC-V processor in 2019, hailed at the time as the most advanced RISC-V chip. From the start, the company declared its intention to open the CPU's source code, the hardware-description-language files defining the structure and behavior of the CPU core's circuitry. It has now done so, without fanfare.

"If Intel made the same statement on the design of the x86 instruction set microprocessor, it would be a big deal," said David Patterson, one of the creators of RISC-V.

As more chip and software vendors adopt the architecture, RISC-V is steadily gaining on Arm and Intel. Patterson notes that all Nvidia GPUs use RISC-V, Samsung phones use RISC-V, and most open-source tools now support it. "RISC-V shipments have reached billions," he said, adding that Alibaba alone has shipped more than 1 billion RISC-V cores. Meanwhile, several other open-source RISC-V cores are already available online.

With RISC-V processors for low-power tasks and custom Arm server CPUs for general-purpose computing, Alibaba now has a full spectrum of computing infrastructure. Its Yitian 710 server system-on-chip (SoC), manufactured by Taiwan Semiconductor Manufacturing Co., has 128 Arm-based cores, integrates 60 billion transistors, and clocks as high as 3.2 GHz. Alibaba says it is the first server processor compatible with the latest Armv9 architecture.

Alibaba says the SoC scored 440 on SPECint2017, a standard benchmark of CPU integer performance, beating today's state-of-the-art Armv8-based Arm server processors by 20 percent in performance and more than 50 percent in energy efficiency.
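Working backward from those claims gives an implied score for the Armv8 baseline; the value computed below is an inference from the quoted percentages, not a published figure:

```python
# Infer the implied SPECint2017 score of the Armv8 baseline from
# Alibaba's claimed numbers: a score of 440, 20 percent ahead of
# today's best Armv8 parts. (The baseline score is an inference,
# not a figure Alibaba or Arm has published.)
yitian_score = 440
performance_lead = 0.20  # "20 percent" faster, per Alibaba

implied_baseline = yitian_score / (1 + performance_lead)
print(f"Implied Armv8 baseline score: {implied_baseline:.0f}")  # ~367
```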

The company also announced a proprietary server, called Panjiu, developed specifically for next-generation cloud-native infrastructure. By separating computing from storage, the server is optimized for both general-purpose and specialized AI computing as well as high-performance storage.

Meanwhile, by opening the source code of its Xuantie series of RISC-V IP cores, Alibaba will let developers build their own prototype chips customized for different IoT applications. The company has also opened the software stack around Xuantie, supporting several operating systems, including Linux, Android, RTOS, and Alibaba's own AliOS, and promises more services and support for RISC-V development tools, software development kits, and custom cores in the future.

Consultant Gwennap says Alibaba's Arm and RISC-V efforts are more experiments than commercial ventures, noting that most of Alibaba's internal workloads still run on Intel x86 chips. "These companies talk a lot about alternatives to Intel," Gwennap said, "but in the final analysis, they don't want to eat their own dog food."

Alibaba's newly launched Arm-based server chip will run in its own data centers, providing cloud services to customers. The company will continue offering Intel-based services, so customers can choose Arm over x86-based chips. When Amazon did something similar a few years ago, its Arm-based chips were hardly adopted at first.

But true semiconductor independence will require China to develop its own extreme-ultraviolet (EUV) lithography machines for etching microcircuits into silicon. China's leading chip foundry, SMIC, cannot offer anything below 14 nanometers. SMIC claims to have mastered a 3-nm chip process in the laboratory and is trying to buy the EUV lithography machines needed for production from the Dutch company ASML, which currently has a monopoly on the key equipment. But the United States intends to block that deal. ("3 nm" refers to the next step-down and tighter spacing of the smallest semiconductor feature sizes, allowing greater transistor density; it does not describe the actual size of a transistor gate or other features on the processor.)

The Chinese Academy of Sciences has an EUV lithography research team, and Tsinghua University has developed a new particle accelerator light source that can be used for EUV lithography. However, it will still take many years to get this technology out of the laboratory and into the machine.

A Japanese startup is developing autonomous robots that can perform useful tasks inside and outside the space station

Late last year, Japanese robotics startup GITAI sent its S1 robotic arm to the International Space Station, as part of a commercial airlock extension module, to demonstrate some useful autonomy in space. Everything on the ISS happens very slowly, so it wasn't until NASA astronauts installed the S1 arm last month that GITAI could put the system through its paces, or rather, sit in comfortable chairs on Earth and watch the arm mostly do its tasks by itself, because that's the dream, right?

The good news is that everything went well, and the arm did everything GITAI wanted it to do. So what's next for commercial autonomous robots in space? GITAI's CEO tells us what the company is working on.

One advantage of working in space is that it's a highly structured environment. Microgravity may be a little unpredictable, but you know the properties of the objects (and even the lighting) very well, because everything up there is exhaustively characterized. So relatively high-precision tasks with a two-finger gripper are entirely feasible, because the system has to cope with very little variation. Of course, things always go wrong, so GITAI also tested teleoperation from Houston, to make sure that putting a human in the loop is an effective way to finish a task too.

Since complete autonomy is much harder than almost-complete autonomy, occasional teleoperation may be crucial for space robots of all kinds. We spoke with Sho Nakanose, CEO of GITAI, to learn more about the company's approach.

IEEE Spectrum: How much autonomy do you think robots working on the International Space Station should have?

Sho Nakanose: We believe that combining 95 percent autonomous control with 5 percent remote judgment and teleoperation is the most efficient way of working. In this ISS demonstration, all work was done with 99 percent autonomous control and 1 percent remote decision-making. In actual ISS operations, however, there will be irregular tasks that autonomous control cannot handle, and we believe such tasks should be handled by remote control from the ground. So we expect the final split to be about 95 percent autonomy and 5 percent remote judgment and teleoperation.

GITAI will apply the general-purpose autonomous space robotics technology, know-how, and experience gained through this demonstration to develop extravehicular robots (EVRs) that can perform docking, repair, and maintenance tasks for on-orbit servicing (OOS) or carry out various activities such as lunar exploration and construction of a moon base. -Sho Nakanose

I'm pretty sure you tested the system many times on the ground before sending it to the ISS. How does operating the robot on the ISS differ from your tests on Earth?

The biggest difference between the ground experiments and the ISS experiments is the microgravity environment, but that was not so difficult to deal with. However, experimenting on the ISS, an unknown environment we had never visited, brought various unexpected situations that were difficult to handle, such as a communication blackout on the Russian module caused by a failed thruster-firing test. We were able to solve every problem because the development team had carefully prepared for contingencies in advance.

It seems the robot is performing many tasks with equipment designed for humans. Do you think it would be better to design things like screws and control panels so that robots can see and manipulate them more easily?

Yes, I think so. Unlike the International Space Station, built in the past, the lunar-orbiting Gateway station and future moon bases are expected to be built for humans and robots to work in together. It will therefore be necessary to design and implement interfaces that are easy for both humans and robots to use. In 2019, GITAI received an order from JAXA to develop guidelines for such human- and robot-friendly interfaces on the ISS and Gateway.

What are you going to do next?

We plan to conduct an on-orbit extravehicular demonstration in 2023 and a lunar demonstration in 2025. We are also working on space-robot development projects for customers from whom we have already received orders.
