Tuesday, November 10, 2015

Fwd: did you see this?

---------- Forwarded message ----------
From: William 


"... Toyota is to invest $3 billion in a new research and development unit to investigate the feasibility of robots becoming a key technology of the future. The money is to be invested over a five-year period to establish the Toyota Research Institute, which will be situated near Stanford University in Silicon Valley...."

Fwd: A great use for virtual reality headsets.

Perhaps also hinting at possible virtual tourism in the very near future (Oculus Rift is set to be released in 2016).


This Robot Will Let Kids In Hospital Explore Zoos Through Virtual Reality

A community called "Robots for Good" has come together to help kids stuck in Great Ormond Street Hospital in London visit the zoo. If the name hasn't given it…



Wednesday, September 23, 2015

MicroPython - Python for microcontrollers
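As a quick taste of what MicroPython code looks like, here is a minimal PWM sketch. It assumes an ESP-style port where `machine.Pin` and `machine.PWM` are available; the pin number is a placeholder, and the helper falls back to plain printing when run under regular CPython:

```python
# MicroPython-flavored sketch: set an LED's brightness with PWM.
# The `machine` module exists only on microcontroller ports; pin 2 is a
# placeholder -- check your board's pinout before using it.
def duty_for_brightness(percent):
    """Map 0-100% brightness to a 16-bit PWM duty value (0-65535)."""
    return int(percent) * 65535 // 100

try:
    from machine import Pin, PWM  # MicroPython-only import
    led = PWM(Pin(2))             # placeholder pin number
    led.freq(1000)
    led.duty_u16(duty_for_brightness(50))
except ImportError:
    # Running under regular CPython: just show the computed duty value.
    print(duty_for_brightness(50))
```

The same `duty_for_brightness` helper runs unchanged on a desktop Python, which is part of MicroPython's appeal: most of the language is just Python.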


Fwd: [ARTIK] The first ARTIK Newsletter is here!

---------- Forwarded message ----------
From: Artie from Samsung ARTIK <artie@artik.io>
Date: Wed, Sep 23, 2015 at 5:11 PM
Subject: [ARTIK] The first ARTIK Newsletter is here!
To: John.sokol@gmail.com

Samsung ARTIK

Tiny. Powerful. Interconnected. Secure.

Build a better Internet Of Things with ARTIK™

Welcome to Fall! (Or Spring, depending on your hemisphere.) We're proud to introduce the inaugural ARTIK Newsletter. It's been a very busy time for the ARTIK team as we travel the Road to Beta. We have been busy shipping a limited number of Alpha versions of the ARTIK developer boards to labs, partners and our Alpha developers, and that's good news for our friends awaiting the Beta release – we're getting close! Our engineering team is working hard to finalize the developer boards and software, with the goal of shipping Beta boards by the end of November.

In the meantime, we hope you've been keeping up with the ARTIK and SAMI blogs. In one of our most recent ARTIK blog posts, we used ARTIK and SAMI, along with libraries from our partner Temboo, to show you how to create a simple weather station that reads the local temperature and writes the data to the SAMI cloud for near-real time access and later analysis.
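The weather-station idea reduces to two steps: package a sample, then push it to a cloud endpoint over HTTPS. The sketch below illustrates the shape of that, but the URL, token handling, and field names ("sdid", "ts", "data") are placeholders, not the actual SAMI or Temboo API schema:

```python
# Hedged sketch of a minimal "read a temperature, send it to the cloud"
# pipeline. Field names and endpoint are illustrative assumptions only.
import json
import time
import urllib.request

def make_reading(device_id, temp_c):
    # Package one sample with a millisecond timestamp.
    return {"sdid": device_id,
            "ts": int(time.time() * 1000),
            "data": {"temperature": temp_c}}

def post_reading(url, token, reading):
    # POST the JSON payload with a bearer token (network call, not run here).
    req = urllib.request.Request(
        url,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```

A real deployment would add batching and retry on failure, but the round trip of sensor, JSON payload, authenticated POST is the whole pattern.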

We've also started our "IoT 101" blog series to help you get up to speed on concepts behind the architecture and design of IoT, and to help you make decisions around your IoT products. We started by covering basic connectivity and networking, with more to come. Keep an eye on the ARTIK blog for more of these tutorials, including an upcoming post on ideas for powering your IoT devices. If there are any topics you'd like to see covered in IoT 101 or elsewhere on our blogs, please let us know.

ARTIK at the Tizen Developer Conference

Recently, at the 2015 Tizen Developer Conference in Shenzhen, China, Tizen 3.0 was announced and demonstrated running on both ARTIK 5 and ARTIK 10. Curtis Sasaki, VP of Ecosystems, discussed the state of the IoT market, introduced the ARTIK platform, and also discussed Tizen on ARTIK and how SAMI is a key enabler.

Samsung VP MK Kang hosted a technical session on ARTIK to a packed room, and Dr. Luc Julia presented and demonstrated how easy it was to create a Tizen TV Manifest in SAMI, get real-time data, and have the Tizen TV interact with other devices already connected to SAMI once the Manifest was created.

Around 1,500 people attended the conference. This was the first time the Tizen Developer Conference was held in Shenzhen, and it was breathtaking to see the rapid modernization of a large city and Shenzhen's push to become a smart city. Needless to say, IoT and ARTIK drew a lot of excitement.


Accessing the ARTIK 10 GPIO

It's time to start thinking about how you will use ARTIK in your application. Have you checked out our GPIO guide yet? (You'll need to login to view the page.) If not, you'll find helpful examples and diagrams describing how to use the ARTIK 10's I/Os. We'll be updating this guide for ARTIK 1 and ARTIK 5 in the future.


Until next time...

We value your feedback! We are working to improve both the content and organization of our documentation so that you can hit the ground running when you receive your ARTIK Beta DevKits. Please get in touch if there is any specific information you'd like to see in the ARTIK documentation, or if there are blog topics, tutorials or IoT tips that can help you get started.

See you on the Road to Beta!

- From your friends at Samsung ARTIK


Check out the ARTIK Blog!

3655 North 1st Street San Jose, CA 95134


Samsung Strategy and Innovation Center

Copyright © 1995-2015 SAMSUNG.

All rights reserved.


EZ Robotics New Product Launch Set to Shake Up the Consumer Robotics Market

2015-09-22 TBOT EZRobotics

Friends who follow our public account have been waiting a long time for this day, and what could be more exciting than the unveiling of a new product? Today we will live up to your expectations.

Design work began at the start of March this year. So what is the robot we are releasing today actually like? Let us reveal it to you, bit by bit.

Before TBot, you may not have realized how long it had been since you last kept your parents company; you may have wanted to be at your child's side but could not, because work and family are hard to balance, and the more capable you are, the harder it gets. With TBot, everything changes: it lets you cross the distances in time and space that you could not cross on your own.

TBot, the first robot from EZ Robotics, stands 1.2 meters tall and weighs 15 kg. Its modular design and iOS-based robot "brain" make it more versatile, extensible, and fun to use than the average robot. A remote video intercom system built on the iOS platform lets you chat with your family easily, anytime and anywhere, shrinking the distance between you. As a 7x24-hour member of the household, TBot can also use its powerful intelligent voice-interaction app to read you the news, play music, check the weather, and hold a conversation, making it a capable intelligent assistant.

TBot's streamlined body pairs gentle, graceful curves with crisp contour lines; its black-and-white color scheme, accented with ice-blue lights, gives it an elegant, high-tech glamour. As an attentive service robot it is also endearingly sturdy: thanks to its two-wheeled self-balancing technology, it will not fall over no matter how you push it. In today's overheated service-robot market, TBot's looks alone make it a delightful standout.

Not only the appearance but also the height received careful thought. TBot is aimed mainly at home users, so its interaction partners may range from a tall adult to a child under one meter, and we paid great attention to versatility in the design. Standing 1.2 meters tall, with a head that can pan and tilt freely, TBot can find the most suitable angle for a user of any height, giving the best interactive experience and the easiest acceptance.

TBot's body uses a modular design, organized as a "brain", a "head", and a "body"; each module can be customized and replaced. This extensibility lets the robot be adapted to different scenarios.

* Brain: an iPad or iPad Mini mounted in TBot's head;

* Head: the biggest highlight, a unit that can be detached and used on its own, with two rotational degrees of freedom (pan and tilt);

* Body: not only good-looking, but easily extended in the future with arms, a projector, and other useful features;

* Landing gear: the most handsome part of the body, and one that gives a sense of stability;

* Two-wheeled self-balancing chassis: large-diameter hubless wheels that clear obstacles well while keeping a clean, high-tech look;

* Charging base: streamlined in appearance, with support for automatic charging.

We believe TBot will become a great companion robot for the family and a 7x24-hour home intelligent assistant. Beyond the home, TBot can also be used for remote offices, factory inspections, distance education, telemedicine, bank branch services, remote museum visits, and even shared meals, living up to our hope: let TBot see the world on your behalf!

The EZ Robotics team and the story behind TBot

EZ Robotics, founded in late 2014, is a company focused on developing cutting-edge service robots. Its team members come from the Hong Kong University of Science and Technology, Tencent, Siemens, Leifeng Network, and other high-tech companies; CEO Zhang Tao and the other core team members each have ten or more years of robotics R&D or related experience. We believe that in the near future, robots will reach every home.

We developed TBot because our team members, like most people, are often unable to be with their parents or children, so we wanted to grind out a warm robot that we love ourselves and that benefits everyone. On the long road toward home and service robots, TBot is only a small first step; EZ Robotics' next generation of robots will bring major breakthroughs in mechanical structure, control, intelligence and other areas, and we share your expectations.

Finally, are you looking forward to TBot as much as we are? On October 11 we will hold a launch experience event in Shenzhen. For more news, please follow our official website and this public account!

EZ Robotics, Easy Life!


Saturday, September 5, 2015

Augmented Pixels: Indoor Navigation Platform for Drones

Drones are notoriously difficult to handle indoors: they are hard to control and hard to keep from crashing into walls or people.

Augmented Pixels has been actively developing technology (including SLAM) to ensure safe flights as well as intuitive and easy navigation using Augmented Reality.

They came up with a platform that significantly reduces accident rates and minimizes the effect of "human factor". Moreover, it is possible to program the drone to fly around and land by itself.
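The safety layer in such a platform boils down to checking every motion command against the map before executing it. The toy sketch below shows that idea with a 2D occupancy grid (0 = free, 1 = obstacle) of the kind SLAM produces; it is an illustration of the concept, not Augmented Pixels' actual code:

```python
# Toy collision guard: refuse any move that would leave the map or enter
# an occupied cell. A real stack works in 3D with uncertainty, but the
# gate-the-command-against-the-map structure is the same.
def safe_step(grid, pos, move):
    """Return the new (x, y) if the move lands on free space, else stay put."""
    x, y = pos[0] + move[0], pos[1] + move[1]
    in_bounds = 0 <= y < len(grid) and 0 <= x < len(grid[0])
    if in_bounds and grid[y][x] == 0:
        return (x, y)
    return pos  # would hit a wall or leave the map: hold position
```

Autonomous fly-around-and-land behavior then becomes a sequence of such guarded steps toward a goal cell.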

The prospects for this technology include a wide range of use cases (e.g. inspection of premises for security, creation of 360-degree tours, etc.). Augmented Pixels is located in Palo Alto, CA.

Tuesday, August 25, 2015

Fwd: Visual Intelligence Through Deep Learning; Open UC Santa Cruz Faculty Position; More

Sent from my iPad

Begin forwarded message:

From: "Embedded Vision Insights from the Embedded Vision Alliance" <newsletter@embeddedvisioninsights.com>
Date: August 25, 2015 at 10:53:20 PM GMT+8
To: john.sokol@gmail.com
Subject: Visual Intelligence Through Deep Learning; Open UC Santa Cruz Faculty Position; More

Embedded Vision Insights

Dear Colleague,

The Alliance continues to publish videos of great presentations from May's Embedded Vision Summit. Make sure you check out, for example, the highly rated keynote "Enabling Ubiquitous Visual Intelligence Through Deep Learning," by Dr. Ren Wu, formerly distinguished scientist at Baidu's Institute of Deep Learning (IDL). Dr. Wu shares an insider's perspective on the practical use of neural networks for vision.

In "Navigating the Vision API Jungle: Which API Should You Use and Why?", Neil Trevett, President of the Khronos Group, maps the landscape of APIs for vision software development. Long-time Alliance collaborator Gary Bradski, President of the OpenCV Foundation, provides an insider's perspective on the new version of OpenCV and how vision developers can utilize it in his presentation, "The OpenCV Open Source Computer Vision Library: Latest Developments." Also make sure to take a look at "Harman's Augmented Navigation Platform: The Convergence of ADAS and Navigation" from that company's Vice President of Technology Strategy, Alon Atsmon.

Roberto Mijat, Visual Computing Marketing Manager at ARM, explores when it makes sense to utilize a graphics core as a coprocessor in his presentation, "Understanding the Role of Integrated GPUs in Vision Applications." And echoing Dr. Wu's neural network focus, Jeff Gehlhaar, Vice President of Technology at Qualcomm, used his presentation "Deep-learning-based Visual Perception in Mobile and Embedded Devices: Opportunities and Challenges" to discuss the benefits, challenges and solutions for implementing neural networks in mobile and embedded devices. And the insights continued the next day at the quarterly Alliance Member Meeting: in "Combining Vision, Machine Learning and Natural Language Processing to Answer Everyday Questions," Faris Alqadah, CEO and Co-Founder of QM Scientific, explains how his firm is simplifying shopping for consumers by extracting and organizing product information from many data sources.

While you're on the Alliance website, make sure to check out all the other great content recently published there. And for timely notification of the publication of new content, subscribe to our RSS feed and Facebook, Google+, LinkedIn company and group, and Twitter social media channels. Thanks as always for your support of the Embedded Vision Alliance, and for your interest in and contributions to embedded vision technologies, products and applications. Please don't hesitate to let me know how the Alliance can better serve your needs.

Brian Dipert
Editor-In-Chief, Embedded Vision Alliance


"Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It," a Presentation from Goksel Dedeoglu of PercepTonic
Goksel Dedeoglu, Ph.D., Founder and Lab Director of PercepTonic, presents the "Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It" tutorial at the May 2014 Embedded Vision Summit. This tutorial is intended for technical audiences interested in learning about the Lucas-Kanade (LK) tracker, also known as the Kanade-Lucas-Tomasi (KLT) tracker. Invented in the early 1980s, this method has been widely used to estimate pixel motion between two consecutive frames. Dedeoglu explains how the LK tracker works and discusses its advantages and limitations, as well as how to make it more robust and useful. Using DSP-optimized functions from TI's Vision Library (VLIB), he also shows how to detect feature points in real time and track them from one frame to the next using the LK algorithm. He demonstrates this on Texas Instruments' C6678 Keystone DSP, where he detects and tracks thousands of Harris corner features in 1080p HD resolution video.
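The core of the LK method fits in a few lines: assuming brightness constancy and small motion, the best translation (u, v) between two frames solves a 2x2 system built from spatial and temporal image gradients. The sketch below is a single-window, single-scale simplification; production KLT trackers like the one in the talk add per-feature windows and image pyramids on top of this step:

```python
# Minimal whole-image Lucas-Kanade translation estimate. Solves the
# normal equations  A [u, v]^T = -[sum(Ix*It), sum(Iy*It)]^T  where
# Ix, Iy are spatial gradients and It is the frame difference.
import numpy as np

def lk_translation(I0, I1):
    I0 = I0.astype(float)
    Iy, Ix = np.gradient(I0)          # spatial gradients (row = y, col = x)
    It = I1.astype(float) - I0        # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)      # (u, v) in pixels
```

On a smooth synthetic image shifted by half a pixel, this recovers the sub-pixel translation to within a few hundredths of a pixel, which is why LK became the workhorse for feature tracking.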

Introduction to the Embedded Vision Opportunity and the Embedded Vision Alliance Community (Embedded Vision Alliance)
Jeff Bier, Founder of the Embedded Vision Alliance and President and Co-Founder of BDTI, presents the introductory remarks at the May 2015 Embedded Vision Summit. Jeff provides an overview of the embedded vision market opportunity, challenges, solutions and trends. He also introduces the Embedded Vision Alliance and the resources it offers for both product creators and potential members, and reviews the event agenda and other logistics.

More Videos


Neural Network Processors: Has Their Time Come? (BDTI)
Lately, neural network algorithms have been gaining prominence in computer vision and other fields where there's a need to extract insights from ambiguous data. Classical computer vision algorithms typically attempt to identify objects by first detecting small features, then finding collections of these small features to identify larger features, and then reasoning about these larger features to deduce the presence and location of an object of interest (such as a face). These approaches can work well when the objects of interest are fairly uniform and the imaging conditions are favorable, but they often struggle when conditions are more challenging. An alternative approach, convolutional neural networks ("CNNs"), massively parallel algorithms made up of layers of computation nodes, has been showing impressive results on these more challenging problems. More
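The basic building block of those CNN layers, stripped to its essentials, is sliding a small kernel over the image and summing elementwise products. A pure-Python sketch for one channel (real frameworks add padding, stride, many channels, learned kernels, and nonlinearities):

```python
# Valid-mode 2D cross-correlation: the core operation of a CNN layer.
# Output size shrinks by (kernel size - 1) in each dimension.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + m][j + n] * kernel[m][n]
                 for m in range(kh) for n in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]
```

A CNN stacks many such filters in layers, learning the kernel weights from data instead of hand-designing the small-feature detectors that classical pipelines rely on.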

Facebook Oculus Acquires Pebbles Interfaces for Gesture Control (Touch Display Research)
Last month, Facebook's subsidiary Oculus reported that it had acquired Israel-based Pebbles Interfaces, which has spent the past five years developing technology that uses custom optics, sensor systems and algorithms to detect and track hand movement. Pebbles Interfaces will be joining the hardware engineering and computer vision teams at Oculus to help advance virtual reality, tracking, and human-computer interactions. More

More Articles


Faculty Position Open at UC Santa Cruz

More Community Discussions


Upcoming Free Qualcomm Vuforia Webinar Discusses Enabling Mobile Apps to See

Intel Expands Developer Opportunities as Computing Expands Across All Areas of Peoples' Lives

Altera Launches Worldwide SoC FPGA Developers Forums

ON Semiconductor Introduces Series of Advanced Image Co-Processors for Next Generation Automotive Camera

More News

About This E-Mail
Embedded Vision Insights respects your time and privacy. If you are not interested in receiving future newsletters on this subject, please reply to this message with the word "UNSUBSCRIBE" in the Subject area.

LETTERS AND COMMENTS TO THE EDITOR: Letters and comments can be directed to the editor, Brian Dipert, at editor@embeddedvisioninsights.com.

PASS IT ON...Feel free to forward this newsletter to your colleagues. If this newsletter was forwarded to you and you would like to receive it regularly, click here to register.

Sunday, August 23, 2015

Fwd: Invitation: Let's build a humanoid robot with computer vision

---------- Forwarded message ----------
From: Hack A Robot <info@meetup.com>
Date: Saturday, August 22, 2015
Subject: Invitation: Let's build a humanoid robot with computer vision
To: john.sokol@gmail.com

New Meetup
Hack A Robot
Added by Thomas Lee
Thursday, August 27, 2015
6:30 PM
South bay area
South bay area, CA
Are you going?
(Venue of this event is TBD.  Looking for suggestions/ offers to host this event as well) Earlier this year, I created Hackabot Nano (a feature-rich Arduino compatible wheeled robot). It was crowdfunded through Kickstarter. The robot was p...
Learn more


Monday, August 17, 2015

Taking Atlas For a Walk

Remember Atlas, the humanoid robot from Boston Dynamics? The company bought by Google, er, owned by Alphabet, and uh, most likely to become Skynet? Well -- they've just shown us that Atlas can take a ...

Sunday, August 2, 2015

Robot stereo vision

From http://smprobotics.com/

They make an Unmanned Ground Video Surveillance Vehicle based on this.

Thursday, June 18, 2015

Fwd: the difference between Servo Motor and Stepper Motor

---------- Forwarded message ----------
From: yumiko <yumiko@thunderlaser.com>
Date: Thursday, June 18, 2015
Subject: the difference between Servo Motor and Stepper Motor
Hi John Sokol,
Many customers ask us: what is the difference between a servo motor and a stepper motor, and why should we choose a servo motor? Today I will answer this question.
A stepper motor is a device for discrete movement, and it is used very widely in today's digital control systems. With the advent of all-digital AC servo systems, however, AC servo motors are increasingly used in digital control systems as well.
In a servo motor, when no control voltage is applied, the stator carries only the pulsating magnetic field generated by the field winding and the rotor stays stationary. When a control voltage is applied, a rotating magnetic field is produced in the stator and the rotor turns; under constant load, the motor's speed varies with the magnitude of the control voltage, and when the phase of the control voltage is reversed, the servo motor reverses direction.
Compared with a stepper motor, a servo motor has these advantages:
1. Accuracy: closed-loop control of position, velocity, and torque eliminates the stepper motor's problem of losing steps.
2. Speed: high-speed performance, with rated speeds generally reaching 2,000 to 3,000 rpm.
3. Adaptability: strong overload resistance, able to withstand loads of three times the rated torque, which suits applications with momentary load fluctuations and fast-start requirements.
4. Stability: smooth operation at low speed, suitable for applications requiring fast high-speed response.
5. Responsiveness: short dynamic response times for acceleration and deceleration, usually within a few tens of milliseconds.
6. Comfort: significantly reduced heat and noise.
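The "closed-loop control" in point 1 is the heart of the difference: a servo drive continuously corrects position from encoder feedback instead of trusting open-loop steps as a stepper does. A toy PID position loop sketches the idea (illustrative only, not any particular drive's firmware):

```python
# Minimal PID controller. A servo drive runs a loop like this at high
# rate: compare commanded position with the encoder reading, and output
# a correction, so lost motion is detected and removed automatically.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

Driving even a simple simulated load with this loop converges on the commanded position; an open-loop stepper, by contrast, simply assumes no steps were lost.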

Have a nice day!

Best Regards 


Sales Department 

Tel: +86 769 8266 5376 
Mobile: +86 150 2409 6180 
Skype: thunderlaser
Website: www.thunderlaser.com 
Facebook: www.facebook.com/thunderLaser 
Add: No.197,Shatian Town,Dongguan City, China 

Friday, May 22, 2015

Fwd: A Message from Robert de Neve - E Systems Has Just Opened Its Doors to DIYs, Makers and Internet of Things Entrepreneurs

---------- Forwarded message ----------
From: Peter Gise <PGise@esystemstechnology.com>
Date: Thursday, May 21, 2015
Subject: A Message from Robert de Neve - E Systems Has Just Opened Its Doors to DIYs, Makers and Internet of Things Entrepreneurs
To: Peter Gise <PGise@esystemstechnology.com>


E Systems, a World Leader in Product Life Cycle-Based Manufacturing (PLM) Technology, has just opened its doors to DIYs, Makers and Internet of Things (IoT) Entrepreneurs!


20 May 2015


Dear Fellow Entrepreneurs,


As part of our collaborative Product Development & Manufacturing Business Model we are able to offer immediate occupancy at our modern Silicon Valley Technology Center for hardware and software prototyping, development and high-volume manufacturing.


Occupancy at the facility includes:


* NO LEASE month-to-month cubicle space rentals starting from $500 per month

* Hatchery Cells for hardware prototyping, development and process design

* Factory Cells for production ramp and high-volume production

* Modern office and lab environment for software development, prototyping and testing

* Conference rooms, teleconferencing and telepresence robots

* Expertise of our OEM team for product guidance and development, including:

  - Product Life Cycle (PLC) Methodology

  - Design for X (DFX) Protocols

  - Accelerating Time to Profit

  - PLC Management

  - Productization

This new, modern facility is located in the heart of Silicon Valley at 6341 San Ignacio Ave., San Jose, California (650-961-0671).


If you or someone you know are interested in learning more about these and our other professional services, feel free to contact us to schedule a visit, or a telepresence robotics tour of the new Silicon Valley Technology Center.




Robert de Neve

President & CEO

E Systems Technology

c: 408-691-2381



Fukushima Robot Dies Three Hours After Entering Radioactive Reactor Vessel / Sputnik International