Perceptive Automata – Making Autonomous Vehicles Intuit
Techweek, Wed, 31 Oct 2018 – https://techweek.com/perceptive-automata-autonomous-vehicles/

How do we get autonomous vehicles to be more ‘human’? Perceptive Automata, a Somerville, Massachusetts-based startup, has developed a solution that helps autonomous vehicles intuit human behaviour, enabling safer deployment. It all started with an experiment.

The Experiment

In early 2016, Sam Anthony, CTO and co-founder of Perceptive Automata, placed a GoPro camera at an intersection to answer one simple question: while driving, how often do people look at another person and try to understand and react to what that person is thinking?

Upon analyzing the videos, he realized people constantly make subconscious judgements about others’ actions. In a 30-second segment, he found over 50 instances of such subtle communication. For example, a scooter rider at the intersection wants to drive through but waits to see whether the oncoming cars will let him pass. Another vehicle arrives, wanting to turn. The scooter moves back, signalling that its rider recognizes the other driver wants to turn and needs space.

Machines lack this ability. When they detect something in their surroundings, they slow down or come to a complete stop, which may not be the right response. This inability to properly read the environment has already resulted in accidents; a classic example is Google’s first crash back in 2016.

Team Perceptive Automata (PC: Perceptive Automata)

Where’s the Autonomous Vehicle?

SAE International defines six levels of driving automation, ranging from no driving automation (level 0) to full driving automation (level 5). Our pop-culture picture of autonomous cars falls into the level 4 (high driving automation) and level 5 categories.

Growing at a CAGR of 39.5%, the global autonomous vehicles market is poised to reach $557Bn by 2026. But things might get delayed. The recent Gartner Hype Cycle places level 4 in the early stages of the ‘trough of disillusionment’ and level 5 halfway through the ‘innovation trigger’. While expert estimates of when autonomous vehicles will hit the road diverge from a few years to a few decades, it is safe to say it won’t be anytime soon.
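As a sanity check on such forecasts, the compound-growth arithmetic is easy to run yourself. A minimal sketch, assuming the 39.5% CAGR is applied over the eight years from 2018 to 2026 (the forecast's exact base year is our assumption, not stated here):

```python
# Back-of-the-envelope check: what 2018 market size is implied by a
# $557Bn 2026 forecast growing at a 39.5% CAGR? (Illustrative only;
# the 2018 base year is our assumption.)
target_2026_bn = 557.0
cagr = 0.395
years = 2026 - 2018  # 8

implied_2018_bn = target_2026_bn / (1 + cagr) ** years
print(f"Implied 2018 base: ${implied_2018_bn:.1f}Bn")  # roughly $39Bn
```

Working the target backwards like this is a quick way to test whether a headline CAGR and end-state figure are mutually plausible.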

Auto tech continues to be a highly anticipated area, with autonomous vehicle startups attracting most of the investment: thus far in 2018, we’ve already seen over 116 deals with a total of $5Bn in funding. The money pours into startups tackling the many challenges on the road to safe deployment, such as mapping, trajectory optimization, and scene perception. Higher-level planning decisions are also a challenge, and one that requires modelling human intuition.

Machines vs Humans

In a paper published by the University of Michigan Transportation Research Institute (UMTRI), Brandon Schoettle finds that machines are well suited to tasks that require quick reaction times, consistency, or processing lots of information from different sources. When it comes to reasoning and perception, however, humans have the advantage, and it is an intense process for an autonomous vehicle to reach that level.

There is a fundamental issue with traditional methods of training machine learning models. The traditional paradigm is straightforward in terms of input and output: the system is trained on a set of images, and the ‘goodness’ of the model is judged by its performance on a sample drawn from the data set, measured against a benchmark. But how do you know whether the model has truly learned the principles that guide decisions, or is simply matching from memory?

While describing the problem with traditional models in a paper on visual recognition, the co-founders invoke the classic Chinese Room argument proposed by the philosopher John Searle: a man who doesn’t speak Chinese is locked in a room with a book of instructions, and his task is to produce Chinese characters in response to messages slipped in (the input) by following the book. The person receiving the result (the output) may conclude that the man inside knows Chinese, but this isn’t the case. The analogy has real-life consequences: when a system is put on the road, it matters that it truly understands its surroundings.

Since one of the major deployment areas for autonomous vehicles is densely crowded urban areas, it is important for them to be in sync with humans. As Sam’s experiment showed, drivers, pedestrians, and other road users constantly exchange non-verbal signals that are intuitive and processed at a subconscious level. Is that person going to cross the road, or will they wait? Such problems require a deeper understanding of human behaviour, not merely the identification of objects in the surroundings.

The Perceptive Automata Solution

Perceptive Automata’s solution is based on visual psychophysics, a branch of psychology that deals with the relationship between stimuli and behavioural responses, as a paradigm to develop computer vision. This method helps quantify, measure and incorporate behavioural aspects into these models.

As opposed to the traditional model, this approach relies on ‘perturbation’: distorting an image with blurs and rotations and then asking the model to identify it. In another paper, the co-founders apply a similar approach to facial recognition, using facial expressions and distortions such as contrast changes to check accuracy. By comparing the model’s results with human results, the papers demonstrate where models are truly ‘superhuman’ and where humans are better.
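To make the idea concrete, here is a toy sketch of a perturbation study, not Perceptive Automata's actual method: a deliberately fragile stand-in "model" (a bright-spot detector of our own invention) is evaluated on synthetic images at increasing blur strengths, producing the kind of accuracy-versus-perturbation curve that can then be compared against human performance.

```python
def box_blur(img, radius):
    """Box-blur a 2D grayscale image (list of lists of floats in [0, 1])."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            patch = [img[ny][nx]
                     for ny in range(max(0, y - radius), min(h, y + radius + 1))
                     for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(patch) / len(patch)
    return out

def spot_detector(img):
    """Stand-in 'model': reports a bright spot if any pixel exceeds 0.6."""
    return max(v for row in img for v in row) > 0.6

def accuracy(images, labels, radius):
    """Fraction of images classified correctly after blurring at `radius`."""
    preds = [spot_detector(box_blur(img, radius)) for img in images]
    return sum(p == l for p, l in zip(preds, labels)) / len(images)

def make_image(with_spot):
    """8x8 flat background; positives carry a single bright pixel."""
    img = [[0.2] * 8 for _ in range(8)]
    if with_spot:
        img[4][4] = 1.0
    return img

images = [make_image(True) for _ in range(10)] + [make_image(False) for _ in range(10)]
labels = [True] * 10 + [False] * 10
for r in (0, 1, 2):
    print(f"blur radius {r}: model accuracy {accuracy(images, labels, r):.2f}")
```

Accuracy collapses from 1.00 to 0.50 as soon as the blur smears the spot below the detector's threshold; plotting such degradation curves for real models alongside human observers is the kind of comparison the papers describe.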

Perceptive Automata imagines self-driving cars that can predict behavior

Based on these models (and more), Perceptive Automata’s algorithms build a richer understanding of their surroundings. Below are a few graphics that help illustrate the approach. The green eye signals that the person is aware of the vehicle’s presence, while the redness of the bar indicates intent to move. When the eye is covered by a red circle and line, the person isn’t aware, and a white bar indicates no intent to move. These cues help the vehicle understand and embed itself in its environment more easily.


Algorithm visualizations (PC: Perceptive Automata)
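In code, the per-pedestrian output behind such graphics might look something like the following sketch; the `PedestrianState` fields, names, and 0.5 threshold are our own illustrative guesses, not Perceptive Automata's actual API.

```python
from dataclasses import dataclass

# Hypothetical sketch of the kind of per-pedestrian scores such a
# system might expose; every name here is our invention.
@dataclass
class PedestrianState:
    awareness: float  # 0.0-1.0: has this person seen the vehicle?
    intent: float     # 0.0-1.0: how likely are they to move into the road?

def render_cues(state, threshold=0.5):
    """Map model scores to the visual cues described in the article."""
    eye = "green eye (aware)" if state.awareness >= threshold else "crossed-out eye (unaware)"
    bar = "red bar (intends to move)" if state.intent >= threshold else "white bar (no intent)"
    return eye, bar

print(render_cues(PedestrianState(awareness=0.9, intent=0.2)))
# ('green eye (aware)', 'white bar (no intent)')
```

Keeping awareness and intent as separate continuous scores, rather than a single "pedestrian detected" flag, is what lets a planner react differently to someone who has seen the car versus someone who has not.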


The Perceptive Future

The four-year-old startup was co-founded by Sid Misra (CEO), Dr. Sam Anthony (CTO), Avery Faller (Senior Machine Learning Engineer), and Dr. David Cox and Dr. Walter Scheirer (advisors continuing their academic work). The team comprises Harvard, MIT, and Stanford neuroscientists and an AI expert, and the idea was conceived while the co-founders were at Harvard. Perceptive Automata stayed in stealth mode for a long time, until July this year, when the team unveiled their work. They recently raised $16M in Series A funding led by Jazz Venture Partners, with participation from Hyundai Motor Company, Toyota AI Ventures, and existing investors First Round Capital and Slow Ventures. Having raised $3M from First Round Capital and Slow Ventures last year and received grants to the tune of $1M, Perceptive Automata has raised $20M thus far. The immediate focus is to grow the product development and customer implementation teams while bringing on more top engineers to further refine the software.

While numerous players across the landscape focus on different challenges, the broader market can be split into full-stack developers (who work on everything from the idea to the final product) and ‘others’. Perceptive Automata competes with in-house teams at Waymo, Uber, and other full-stack developers while, at the same time, being a potential solution provider to those firms and the ‘others’. Humanizing Autonomy, a London-based company developing a “pedestrian intent prediction platform”, is one of the few known startups working on incorporating intentions and behavioural sciences; there may be others still in stealth mode. By tackling one of the most pressing challenges in the safe deployment of autonomous vehicles, Perceptive Automata is poised to emerge as one of the key players in the landscape.


The co-founders (R-L): Dr. Sam Anthony (CTO), Sid Misra (CEO), Avery Faller, Dr. David Cox and Dr. Walter Scheirer (PC: Perceptive Automata)

Formlabs’ 3D Printer – The Business Of Printing The Future
Techweek, Thu, 31 May 2018 – https://techweek.com/3d-printer-startup-formlabs-boston/

In the early 2000s, engineers, designers, and creatives were obsessing over the immediate, printable future. They would build 3D printers from open-source designs and throw around buzzwords like ‘changing humanity’. The big idea was 3D printing (a family of processes in which layers of material are joined to create a three-dimensional object from a digital file), and depending on what one was reading, it either felt futile or as if the future were hidden in a cartridge.

Once the market saw smaller, more reliable 3D printers, the media gobbled up the innovation and optimistically shared the following analogy: remember how everyone thought computers would only be used by nerds, but then they were splashed around and blinking in almost every home? 3D printers, too, would be the next computers, the next microwaves, the next anything found in every modern home.

But nearly two decades later, that future is still elusive.

Innovators like Bre Pettis of MakerBot – the first company to sell the vision of desktop 3D printers – have moved on, and a range of analysts have dissected the trajectory and identified the one monumental mistake: the market for 3D printers is businesses and colleges, not individuals. Not yet.

But as MakerBot was hoarding desk space and mindshare, hoping you’d buy a 3D printer, a new Massachusetts-based startup, Formlabs, was getting ready for launch.


Formlabs 3D Printers (PC: Formlabs Facebook Page)

Started in 2011 by three MIT engineers and designers, this 3D printing company seems to have got something right.

With a roster of clients such as Sony, Google, and Harvard University, the company has, first and foremost, aptly positioned itself as a 3D printing solution for “professionals”. Last month, it raised $30 million in a Series C round led by Tyche Partners, which puts its total funding north of $85 million.

Formlabs’ secret? It’s a little technical. Startups before Formlabs used FDM (fused deposition modelling), a technology that builds parts layer by layer by heating and extruding a thermoplastic filament. Formlabs is credited as the first company to launch smaller, more affordable 3D printers using stereolithography, or SLA.

SLA converts liquid resin (supplied in cartridges) into solid parts, layer by layer, by selectively curing it with a light source in a process called photopolymerization. SLA is more effective than FDM at creating intricate designs and is used to create models and prototypes for dentistry, jewelry, toys, footwear, and education, among other fields.
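Because SLA builds a part one cured layer at a time, layer count dominates print time. A back-of-the-envelope sketch; the 0.1 mm layer height is a common SLA setting, and the ten seconds per layer is our assumption, not a Formlabs specification:

```python
# Rough SLA print-time arithmetic (illustrative assumptions, not
# Formlabs specifications).
part_height_mm = 50.0
layer_height_mm = 0.1      # a common SLA layer-height setting
seconds_per_layer = 10.0   # assumed cure + peel + reposition time

layers = round(part_height_mm / layer_height_mm)
print_time_h = layers * seconds_per_layer / 3600
print(f"{layers} layers, roughly {print_time_h:.1f} hours")  # 500 layers, roughly 1.4 hours
```

Halving the layer height doubles the layer count, and hence roughly doubles the print time, which is the core trade-off between surface detail and speed in layer-based printing.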

Once the three MIT students, Maxim Lobovsky, Natan Linder, and David Cranor, envisioned this, they raised money for their first SLA-based printer, the Form 1, through a Kickstarter campaign. People were clearly in the future-hidden-in-a-cartridge camp: the campaign raised a record-breaking $2.95 million and made the Form 1 one of the most highly funded projects of all time.

Since then, the company has gone on to build faster and more reliable printers such as Form 1+, Form 2, Fuse 1, and an industrial system called Form Cell. The Form 2, its most popular product, costs more than $3,000.

3D Printing For All

For the ideological and prescient 3D printing community, this technology is not about printing trinkets and plastic action figures, but about building things that reduce your trips to the store and, more importantly, bring value. Formlabs, too, sees 3D printing as the future of customisation and a challenger to mass production.

For example, one of Formlabs’ clients is Egf, a German wedding ring manufacturer that uses the startup’s Form 2 desktop 3D printer. They configure a wedding ring in CAD software, print it in 3D, and give it to the customer before the ring is hand-crafted by the goldsmith. Reportedly, Formlabs CEO Max Lobovsky also proposed to his fiancée with a 3D printed ring.


A 3D printed ring: before and after (PC: Formlabs Facebook Page)

But from knowing how to design in 3D to ensuring that the printer works smoothly without jamming or spewing out a crooked product, mass adoption of 3D printing is laden with challenges. For Egf, the Formlabs blog says, “The company customized their CAD software (A 3D printing design software) so that employees, seated next to the prospective customer, can easily make adjustments to pre-loaded engagement ring designs without formal CAD training.”

In another example, Formlabs has used SLA to build custom earbuds. The Formlabs blog explains that a technician uses a 3D scanner to take a quick, non-intrusive digital scan of your ear canal, edits the digital file into a 3D-printable mold, and sends it wirelessly to the 3D printer. Once printed, the mold is used to cast the earbud shell, which is then removed and given its final finishing and coating.


Custom earbuds are made by casting a biocompatible silicone in hollow molds printed in Formlabs Clear Resin. Each printed mold costs $0.40 to $0.60 in resin, and the overall production of a final pair of earbuds costs approximately $3 to $4 in raw materials (including silicone and lacquer). (PC & Caption: Formlabs)
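The caption's figures support a quick materials-cost breakdown. A sketch using the caption's stated ranges; the two-molds-per-pair figure (one per earbud) is our assumption, not Formlabs':

```python
# Materials-cost arithmetic from the caption's stated ranges; the
# two-molds-per-pair assumption is ours, not Formlabs'.
mold_resin_low, mold_resin_high = 0.40, 0.60  # resin cost per printed mold (USD)
pair_low, pair_high = 3.0, 4.0                # raw materials per finished pair (USD)

molds_per_pair = 2  # assumed: one mold per earbud
resin_share_low = molds_per_pair * mold_resin_low / pair_high
resin_share_high = molds_per_pair * mold_resin_high / pair_low
print(f"Printed molds account for roughly {resin_share_low:.0%}-{resin_share_high:.0%} "
      f"of the raw-material cost")  # roughly 20%-40%
```

Under these assumptions, the printed molds are a minority of the material cost, with the silicone and lacquer making up the rest, which is why the printer's per-part economics can work even for small custom runs.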

Clearly, 3D printing has multiple uses. Apart from the previously mentioned dentistry, jewelry, and audiology, it is used to build toys, eyeglass frames, footwear, and a range of prototypes to suit industry requirements. Architects, for example, use printed models to assess designs internally and communicate with clients. Similar processes apply in varied industries that print individual parts: Deutsche Bahn, for example, finds it difficult to manufacture parts for its 40-to-50-year-old trains, so it relies on 3D printers, as does General Electric.

However, 3D printers, like existing smart home appliances, are still not considered revolutionary. The printers may be smaller, speedier, and more reliable than before, and they may require far less labour, but 3D printing is still not simple. It isn’t just the push of a button: it involves high printer and cartridge costs and an ample amount of tinkering and designing.

But with companies like Formlabs in the lead, there’s plenty of chatter about custom production as the future of 3D printing. Every once in a while, the discussion is backed by studies, like a recent ING Think report which notes: “Tentative calculations show that, if the current growth of investment in 3D printers continues, 50% of manufactured goods will be printed in 2060.”
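An extrapolation of that shape can be illustrated with a simple compound-growth loop. Both inputs below, the 0.1% starting share of manufactured goods in 2018 and the 16% annual growth rate, are our own assumptions chosen to show how such a calculation can land near the report's 2060 figure; they are not ING's actual inputs:

```python
# Illustrative compound-growth extrapolation (assumed inputs, not ING's).
share, year = 0.001, 2018   # assume 0.1% of manufactured goods printed in 2018
growth = 1.16               # assume the printed share grows 16% per year

while share < 0.5:          # until half of manufactured goods are printed
    share *= growth
    year += 1
print(f"The printed share crosses 50% in {year}")  # ...in 2060
```

The exercise also shows why such forecasts are "tentative": a few points of growth rate either way shifts the crossover year by decades.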

In a wholly different way than in the days of the Gutenberg press, the future might just be printable.
