Plessey goes into LED light bulb ‘filament’ production

Plessey Filament Bulb

Plessey has launched a range of LED ‘filaments’ based on GaN-on-Si die made in Plymouth.

“The filaments are designed for the surging filament bulb market where these replacement lamps have far better performance, but maintain the physical appearance of incandescent lamps,” said the firm.

Called the PLF series, the chip-on-board filaments create the same amount of light as an incandescent filament, while consuming less energy and lasting longer.

The terminations are unique, said Plessey: they can be handled and spot-welded by existing high-volume automated glass lamp manufacturing lines. The firm has also incorporated a mechanism to control the current and forward voltage (Vf) of the filaments when they are driven in a bridge configuration.

“Plessey will also be incorporating other active and passive electronic components for chip-on-board and chip-scale packaging solutions in the next generation of filaments,” said company CTO Dr Keith Strickland.

PLF series filaments come in a variety of lengths and light outputs, with colour temperatures ranging from a very warm 2,200K to a sunlight-cool 6,500K.

In November last year, Plessey pushed its GaN-on-Si LEDs through the 120 lm/W barrier. It has been developing LED production in Devon from a standing start when it acquired University of Cambridge spin-out CamGan in early 2012.

 

Steve Bush

NTSB concludes Virgin Galactic SpaceShipTwo crash review

Virgin Galactic SpaceShipTwo

Human error, not a mechanical fault, caused the crash of the Virgin Galactic SpaceShipTwo last October, a review has concluded. But the companies behind the craft and the regulator that approved its flight are not off the hook.

The crash was a result of co-pilot Michael Alsbury unlocking the spacecraft’s descent system too early, according to a review conducted by the US National Transportation Safety Board (NTSB) and published on Tuesday.

But Virgin Galactic and Scaled Composites, which built the craft, did not do enough to mitigate the risks of this occurring, and the Federal Aviation Administration (FAA), which issues commercial spaceflight permits, did not pick up on their oversight, the review concludes.

Alsbury was killed in the crash, while pilot Peter Siebold was seriously injured. “We cannot undo what happened, but it is our hope that through this investigation we will find ways to prevent this from happening again,” said NTSB chairman Christopher Hart during an NTSB board meeting in Washington DC earlier today.

Feather wings

SpaceShipTwo is a sub-orbital vehicle designed for tourist flights to the edge of space. As it descends back to Earth, its “feather” wings are meant to rotate upwards to provide drag and slow the craft. Initial analysis of video from the cockpit showed Alsbury unlocking SpaceShipTwo’s wings prior to the crash while it was travelling at Mach 0.92, just below the speed of sound, rather than at Mach 1.4 as intended. This meant the wings extended too early and were subject to extreme stress that ultimately led to the break-up of the spacecraft.

So why did Alsbury unlock the wings at the wrong time? The NTSB pointed to a number of contributing factors. Scaled Composites’ hazard analysis did not consider the possibility that pilots would make a mistake during normal flight, only that they might take the wrong action in response to an issue with the vehicle. As such, there were no warnings or limitations on the spacecraft controls to prevent early unlocking.

In May 2013, the FAA told Scaled Composites that it had concerns that its hazard analysis did not meet the requirements for an experimental flight permit. But a few months later, in July 2013, the FAA issued a waiver for these requirements and granted the permit. The NTSB’s investigation found that Scaled Composites did not request this waiver, and some FAA inspectors were unfamiliar with SpaceShipTwo and thought that the requirements had been met.

Virgin Galactic, which has since taken over full responsibility for the vehicle from Scaled Composites, has now modified the spacecraft to prevent the wings from unlocking too early. The company has also modified the flight checklist to warn against unlocking at the wrong time and says that it will post copies of its own submission to the NTSB online later today. “We thank the @NTSB for their professionalism, expertise, and insight, and we welcome the results of their investigation,” said the company in a tweet.

The NTSB proposed eight safety recommendations for the FAA to improve its review processes, including that it should work with private space firms before they start designing their vehicles, not just before they fly, and gave similar recommendations to the spaceflight industry. “Hundreds of people whose only qualification for spaceflight is their ability to purchase a ticket await the opportunity to go into space,” said Hart. “For such flights to proceed safely, commercial space transportation must continue to evolve and mature.”

Syndicated content: Jacob Aron, New Scientist

See also: Firms must explore new aerospace business

See also: Space: Virgin Galactic ship tests its feathering

 

Alun Williams

QuickLogic introduces multi-core EOS sensor hub

EOS – QuickLogic Flexible Fusion Engine S2 sensor hub block diagram

QuickLogic has introduced a triple-core sensor hub called EOS.

The justification for sensor hubs is that “it is power-prohibitive to do sensor processing on the host processor,” according to QuickLogic vice-president Brian Faith.

The three cores are an ARM Cortex-M4F MCU, a front-end sensor manager and a QuickLogic proprietary core which it calls the Flexible Fusion Engine (FFE). A fourth core could be integrated into the hub’s FPGA fabric.

The FFE and sensor manager handle the bulk of the algorithm processing, which minimises the duty cycle of the floating-point MCU.

This approach lowers aggregate power consumption and enables mobile, wearable and IoT device designers to introduce next-generation sensor-driven applications, such as pedestrian dead reckoning (PDR), indoor navigation, motion-compensated heart-rate monitoring and other advanced biological applications, within their power budgets.
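
The division of labour is that the small cores run continuously while the M4F mostly sleeps. Purely as an illustration of that pattern, the sketch below shows an event-driven M4F main loop that wakes when the FFE signals a fused result; every function name in it is hypothetical and is not QuickLogic’s actual SDK API.

```c
/* Minimal sketch of the offload pattern, assuming a hypothetical driver API
 * (the ffe_* and spi_slave_* calls are illustrative, not QuickLogic's SDK).
 * The FFE runs the fusion algorithm continuously; the Cortex-M4F sleeps
 * and only wakes to forward high-level results to the host. */

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical driver prototypes. */
extern void     ffe_load_pedometer(void);        /* program the FFE algorithm  */
extern void     ffe_start(void);                 /* start autonomous execution */
extern uint32_t ffe_read_step_count(void);       /* fetch the latest result    */
extern void     spi_slave_notify_host(uint32_t); /* raise data to the host AP  */

static volatile bool ffe_event_pending = false;

/* Interrupt raised by the FFE when a fused result is ready. */
void FFE_IRQHandler(void)
{
    ffe_event_pending = true;
}

int main(void)
{
    ffe_load_pedometer();
    ffe_start();

    for (;;) {
        if (ffe_event_pending) {
            ffe_event_pending = false;
            spi_slave_notify_host(ffe_read_step_count());
        }
        __asm volatile ("wfi");   /* sleep; the M4F duty cycle stays low */
    }
}
```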

The EOS platform includes a hardened subsystem specifically designed for always-listening voice applications.

With its dedicated PDM-to-PCM conversion block and Sensory’s Low Power Sound Detector (LPSD) technology, the EOS system enables always-on voice triggering and recognition while consuming less than 350µA.
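
The PDM-to-PCM conversion is done in dedicated hardware on EOS, so the following is only a software illustration of what such a block does: a first-order moving-average (boxcar) decimator that turns a 1-bit PDM stream into 16-bit PCM samples. The decimation ratio and bit-packing are assumptions for the sketch, and a real converter would cascade further filter stages.

```c
/* Illustrative PDM-to-PCM decimation (first-order moving average).
 * Not EOS hardware behaviour; decimation ratio and packing are assumed. */

#include <stdint.h>
#include <stddef.h>

#define DECIMATION 64   /* e.g. 3.072MHz PDM clock -> 48kHz PCM */

/* Convert packed 1-bit PDM samples (MSB first) into 16-bit PCM.
 * pdm_bytes must hold pcm_count * DECIMATION / 8 bytes. */
void pdm_to_pcm(const uint8_t *pdm_bytes, int16_t *pcm, size_t pcm_count)
{
    size_t bit = 0;
    for (size_t n = 0; n < pcm_count; n++) {
        int32_t sum = 0;
        for (int i = 0; i < DECIMATION; i++, bit++) {
            int one = (pdm_bytes[bit >> 3] >> (7 - (bit & 7))) & 1;
            sum += one ? 1 : -1;              /* map 1/0 density to +1/-1 */
        }
        /* sum lies in [-DECIMATION, +DECIMATION]; scale to 16-bit range */
        pcm[n] = (int16_t)((sum * 32767) / DECIMATION);
    }
}
```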

“It solves the problem of doing voice recognition at low power,” says Faith.

EOS has 2,800 effective logic cells of in-system reprogrammable logic that can be used to implement an additional FFE or customer-specific hardware features.

The EOS S3 platform and QuickLogic’s SenseMe library are compliant with Android Lollipop (5.0+) as well as various RTOSes.

Since the platform is sensor- and algorithm-agnostic, it can support third-party and customer-developed algorithms through QuickLogic’s industry-standard Eclipse Integrated Development Environment (IDE) plugin.

The IDE provides optimised and proven code generation tools as well as a feature-rich debugging environment to ensure quick porting of existing code into both the FFE and the ARM M4F MCU of the EOS S3 platform.

Applications include:

  • Always-on, always-listening voice recognition and triggering
  • Pedometry, pedestrian dead reckoning, and indoor navigation
  • Sports and activity monitoring
  • Biological and environmental sensor applications
  • Sensor fusion including gestures and context awareness
  • Augmented reality
  • Gaming

Processor Cores

  • 180DMIPS of aggregate processing capability
  • 578KB of aggregate SRAM for code and data storage

QuickLogic Proprietary microDSP Flexible Fusion Engine

  • 50KB SRAM for Code
  • 16KB SRAM for Data
  • Very long instruction word (VLIW) microDSP architecture
  • 50µW/MHz
  • As low as 12.5µW/DMIPS

ARM Cortex M4F

  • Up to 80MHz
  • Up to 512KB SRAM
  • 32-bit, includes floating point unit
  • 100µW/MHz; ~80µW/DMIPS

Programmable Logic

  • 2,800 effective logic cells
  • Capable of implementing an additional FFE and customer-specific functionality

Package Configurations

  • Ball grid array (BGA)
  • 3.5×3.5×0.8mm, 0.40mm ball pitch
  • 49-ball, 34 user I/Os

Wafer Level Chip Scale Package (WLCSP)

  • 2.5×2.3×0.7mm, 0.35mm ball pitch
  • 36-ball, 28 user I/Os

Integrated Voice

  • Always-on voice trigger and phrase recognition capability, in conjunction with Sensory
  • I2S and PDM microphone input with support for mono and stereo configurations
  • Integrated hardware PDM to PCM conversion
  • Sensory low power sound detector (LPSD)

Interface Support

  • To host – SPI slave
  • To sensors and peripherals – SPI master (2X), I2C, UART
  • To microphones – PDM and I2S

Additional Components

  • ADC – 12-bit sigma-delta
  • Regulator – low drop-out (LDO), with 1.8V to 3.6V input support
  • System clock – integrated 32kHz and high-speed oscillator

David Manners

Intel, Micron plan crosspoint memory

Intel and Micron plan to have samples of a 128Gbit 3D crosspoint memory by the end of the year with commercial shipments in 2016.

Intel claims that the new memory, which it calls 3D XPoint, writes up to 1,000 times faster than NAND and has 1,000 times the endurance of NAND.

The memory has a cross point array structure described, in the press release, as a “3D checkerboard where memory cells sit at the intersection of word lines and bit lines, allowing the cells to be addressed individually. As a result, data can be written and read in small sizes, leading to faster and more efficient read/write processes.”

The release adds:

“More details about 3D XPoint technology include:

Cross Point Array Structure – Perpendicular conductors connect 128 billion densely packed memory cells. Each memory cell stores a single bit of data. This compact structure results in high performance and high-density bits.

Stackable – In addition to the tight cross point array structure, memory cells are stacked in multiple layers. The initial technology stores 128Gb per die across two memory layers. Future generations of this technology can increase the number of memory layers, in addition to traditional lithographic pitch scaling, further improving system capacities.

Selector – Memory cells are accessed and written or read by varying the amount of voltage sent to each selector. This eliminates the need for transistors, increasing capacity while reducing cost.”
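
The release’s key architectural point is that each cell sits at a word-line/bit-line intersection and is individually addressable, rather than being handled in NAND-style pages and blocks. The toy model below shows only that addressing arithmetic; the array dimensions are arbitrary and it says nothing about the real cell or selector physics, which Intel and Micron have not disclosed.

```c
/* Toy model of a bit-addressable crosspoint array: one bit per word-line /
 * bit-line intersection, so a linear address maps directly to (row, column).
 * Dimensions are arbitrary; this illustrates the addressing granularity
 * only, not the actual 3D XPoint cell or selector design. */

#include <stdbool.h>
#include <stdint.h>

#define WORD_LINES 1024u   /* rows    */
#define BIT_LINES  1024u   /* columns */

static bool cell[WORD_LINES][BIT_LINES];

static void write_bit(uint32_t addr, bool value)
{
    uint32_t row = addr / BIT_LINES;   /* word line to select */
    uint32_t col = addr % BIT_LINES;   /* bit line to select  */
    cell[row][col] = value;            /* only this cell changes */
}

static bool read_bit(uint32_t addr)
{
    return cell[addr / BIT_LINES][addr % BIT_LINES];
}
```

The contrast with NAND, which is programmed in pages and erased in whole blocks, is what the release means by data being “written and read in small sizes”.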

David Manners

Wize Mirror reads back vital signs

Wize Mirror

Mirror mirror on the wall, am I at risk of heart disease? One day soon your mirror might actually be able to give you the answer.

Wize Mirror looks like a mirror, but incorporates 3D scanners, multispectral cameras and gas sensors to assess the health of someone looking into it. It does this by examining the person’s face, looking at fatty tissue, facial expressions and how flushed or pale they are.

Facial recognition software looks for telltale markers of stress or anxiety, while the gas sensors take samples of the user’s breath looking for compounds that give an indication of how much they drink or smoke. The 3D scanners analyse face shape to spot weight gain or loss, while the multispectral cameras can estimate heart rate or haemoglobin levels.
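
The article does not describe how the consortium’s software works internally. For the heart-rate part specifically, a common camera-based technique (remote photoplethysmography) tracks the tiny periodic colour change in facial skin as blood volume pulses. The sketch below assumes you already have one mean green-channel value per video frame over a face region, and gives a crude beats-per-minute estimate by counting peaks of the locally detrended signal; it illustrates the general technique, not the Wize Mirror algorithm.

```c
/* Crude heart-rate estimate from per-frame mean green values over a face
 * region (remote photoplethysmography). Illustrative only: assumes the
 * face-tracking and per-frame averaging have already been done. */

#include <stddef.h>

/* green: one mean green value per frame; n: frame count; fps: frame rate. */
double estimate_bpm(const double *green, size_t n, double fps)
{
    if (n < 3 || fps <= 0.0)
        return 0.0;

    size_t win = (size_t)fps;          /* ~1 second baseline window */
    if (win < 1) win = 1;

    size_t peaks = 0;
    for (size_t i = 1; i + 1 < n; i++) {
        /* local baseline around sample i, to remove lighting drift */
        size_t lo = i > win ? i - win : 0;
        size_t hi = i + win < n ? i + win : n - 1;
        double mean = 0.0;
        for (size_t k = lo; k <= hi; k++)
            mean += green[k];
        mean /= (double)(hi - lo + 1);

        /* count a peak: local maximum that rises above the baseline */
        if (green[i] - mean > 0.0 &&
            green[i] > green[i - 1] && green[i] >= green[i + 1])
            peaks++;
    }

    double seconds = (double)n / fps;
    return 60.0 * (double)peaks / seconds;   /* beats per minute */
}
```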

After the software has analysed the face – which only takes about a minute – the mirror produces a score that tells the user how healthy they seem. It also displays personalised advice on how to improve their health.

Wize Mirror is being developed by a consortium of researchers and industry partners from seven European Union countries, with EU funding. Sara Colantonio and colleagues from the National Research Council of Italy, which coordinates the project, want to use Wize Mirror to address common long-term health issues that are difficult to treat once something has already gone wrong, like heart disease or diabetes.

“Prevention is the most viable approach to reduce the socio-economic burden of chronic and widespread diseases, such as cardiovascular and metabolic diseases,” they write.

Clinical trials of the device will begin next year at three sites in France and Italy, aiming to compare its readings with those from traditional medical devices.

Facing the consumer

Consumer technology that can read signals from the body to interpret underlying physical and mental health is on the cusp of becoming part of everyday life. For example, Cardiio, originally developed at the Massachusetts Institute of Technology, is an app that uses a smartphone’s camera to monitor blood levels in the face and tell you your heart rate.

At MIT’s Media Lab, Javier Hernandez has looked at using mirrors for health monitoring. He also developed a program called SenseGlass, which uses Google Glass and other wearables to measure someone’s mood and help them manage emotions.

Hernandez says that although mirrors are great for health monitoring because we use them every day, putting them to use in this way is trickier than it sounds. “Accurate health assessments in natural settings are quite challenging due to many factors such as illumination changes, occlusions and excessive motion,” he says.

Journal reference: Biosystems Engineering, DOI: 10.1016/j.biosystemseng.2015.06.008

Alun Williams

Touch displays need extra EMI protection, says supplier

Capacitive touch screen panels are particularly susceptible to electromagnetic interference (EMI).

EMI coupling can result in the touch screen assembly being affected both by external noise and by emissions from the rest of the system.

As well as the system’s power and data-processing functions, the display and its associated electronics may also emit EMI.

To address EMI in capacitive touch panels, display module supplier andersDX is now using customised driver ICs that have EMI-specific filter circuits.

Mike Logan, display and input technology manager at andersDX, said:

“Engineers understand that EMC needs to be tackled from the ground up. Capacitive touch screens are sensitive to EMI and protection can’t be added as an afterthought.”

These techniques need to be applied on a case-by-case basis, taking into account multiple factors, in particular the operating environment and the design of the host product.

“There is no one-size-fits-all solution here. We will work with customers from initial concept right through to qualification, to ensure that the display works reliably in their application, doesn’t interfere with other systems in the environment and complies with the required standards,” said Logan.
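
The filtering Logan describes is built into the driver IC hardware. Purely as a firmware-side illustration of the same principle (rejecting impulsive, EMI-induced spikes in raw capacitance readings before they are interpreted as touches), the generic sketch below combines a median-of-three filter with first-order smoothing. It is not tied to any andersDX part or controller.

```c
/* Generic firmware-side noise filtering for raw capacitive-touch samples.
 * Illustrative only: real mitigation in the parts discussed above is done
 * in the driver IC's own filter circuits. */

#include <stdint.h>

/* Per-channel filter state. */
typedef struct {
    uint16_t hist[3];   /* last three raw samples, for the median    */
    uint32_t iir;       /* smoothed value, fixed point, shifted << 4 */
} touch_filter_t;

static uint16_t median3(uint16_t a, uint16_t b, uint16_t c)
{
    if (a > b) { uint16_t t = a; a = b; b = t; }
    if (b > c) { uint16_t t = b; b = c; c = t; }
    if (a > b) { uint16_t t = a; a = b; b = t; }
    return b;
}

/* Feed one raw capacitance sample; returns the filtered value. */
uint16_t touch_filter_update(touch_filter_t *f, uint16_t raw)
{
    /* median-of-three rejects single-sample EMI spikes */
    f->hist[0] = f->hist[1];
    f->hist[1] = f->hist[2];
    f->hist[2] = raw;
    uint16_t med = median3(f->hist[0], f->hist[1], f->hist[2]);

    /* first-order IIR smoothing: new = old + (med - old)/8, in <<4 fixed point */
    uint32_t med_fp = (uint32_t)med << 4;
    f->iir += (uint32_t)((int32_t)(med_fp - f->iir) / 8);

    return (uint16_t)(f->iir >> 4);
}
```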

 

Richard Wilson