This blog was brought to you by Builder Nation, the community of Hardware leaders developing world-changing products, and sponsored by ControlHub, the purchasing software for hardware companies.
The world has seen rapid technological advancements in the past few decades, particularly in robotics. The robotics industry has revolutionized how we live and work and has become an integral part of modern society. With the industry’s growth, it has become increasingly important to understand its impact and potential for the future.
Companies like Cogniteam are essential as they provide innovative and cutting-edge solutions that help businesses automate their operations, optimize their processes, and improve their overall performance.
Robotics Industry
The robotics industry has a long-standing history, dating back to the introduction of the first industrial robot in the 1950s. In recent years, however, the industry has experienced remarkable growth: the market is projected to reach $149.87 billion by 2030, at a compound annual growth rate (CAGR) of 27.7%.
These programmable machines can perform tasks autonomously or with minimal human intervention. They have transformed the manufacturing, healthcare, and agriculture industries, making processes more efficient and cost-effective.
Robotics has also enabled advancements in space exploration and has been used in military and defense applications. Trends in the industry include collaborative robots, or cobots, which work alongside humans to perform tasks, and artificial intelligence, which allows for greater autonomy in robotic systems.
The expanding robotics industry has numerous applications across various fields. For instance, industrial robotics has transformed manufacturing, improving efficiency and precision on assembly lines. Cobots have also become increasingly common, working alongside humans on tasks such as assembly, packaging, and inspection.
In the healthcare industry, robots are used for various applications, including surgical procedures, patient care, and rehabilitation. Telepresence robots have enabled doctors to communicate with patients remotely, improving access to healthcare.
Additionally, robotics in agriculture and farming has led to greater efficiency and precision in farming operations. Robots are used for planting, harvesting, and irrigation, while drones are used for crop monitoring and analysis.
The military and defense industry has also embraced robotics for surveillance, bomb disposal, and reconnaissance, taking on tasks that would be too risky for humans.
In space exploration, robots have been essential for studying distant planets and celestial bodies, providing valuable data about them. They are designed to operate in low-gravity environments and withstand extreme temperatures and radiation.
Artificial intelligence (AI) advancements have been one of the most significant trends in industrial robotics. AI algorithms allow robots to learn and adapt to their environment, improving their performance and reducing errors.
Sectors
Cogniteam operates in various industries and sectors, providing tailored solutions for each.
Here are some of the sectors:
Agriculture: Cogniteam's agricultural solutions help farmers optimize their operations and improve their crop yields. Using advanced technologies such as machine learning and computer vision, they assist farmers in monitoring their crops, identifying potential issues, and taking corrective action before problems occur.
Delivery: these solutions help businesses streamline their delivery processes, shorten delivery times, and reduce costs, enabling faster and more efficient delivery services.
Real Estate: Cogniteam's real estate solutions help businesses optimize their real estate operations, improve efficiency, and reduce costs by automating processes, monitoring properties, and making data-driven decisions about their real estate portfolios.
A Look into the Future
Looking into the future, the robotics industry will continue transforming our lives. Cogniteam is at the forefront of this industry, providing innovative solutions for businesses to manage their robotic fleets easily.
As the industry evolves, we expect to see the continued growth of collaborative robots and significant advancements in healthcare and agriculture. Moreover, AI and machine learning, swarm robotics, and human-robot interaction are areas of focus for future research and development.
The future of the robotics industry is bright, and Cogniteam is leading the way with cutting-edge solutions.
Book a demo and start your way to the finish line.
Absolutely! Today, a robotic start-up usually takes about six years to get to market, while a software company takes a year and a half or two. Robotic start-ups begin with a small prototype and expect to grow directly into production and sales. Or so they think…
Cogniteam's Co-Founder and CEO, Dr. Yehuda Elmaliah, was invited to speak at the DAI 2021 virtual conference in Shanghai, China. The Distributed Artificial Intelligence (DAI) conference brings together researchers and practitioners to provide a single, high-profile, internationally renowned forum for the theory and practice of distributed AI, and drew more than 1,000 online participants.
Dr. Elmaliah was born in the '70s and spent his childhood in the '80s, with pop-culture movies and TV shows such as "Transformers" and "The Terminator" igniting his imagination about what the future might hold. Forty years later, with a robotic revolution long expected, consumer robotics still consists mainly of vacuum cleaners. Not exactly the revolution he imagined…
Today is the "Golden Age" of single-mission robots. In the next 15-20 years, we will see more and more robots doing one thing, one task: from drone deliveries and autonomous warehousing robots to cleaning and inspection robots. By 2025, 50,000 warehouses and logistics centers will be mostly autonomous and will use 4 million robots, and the industrial and service robot market is expected to grow from 76.6 billion USD in 2020 to 176.8 billion USD in 2025. The robotics revolution starts now.
Building a robot is all about integration: choosing the correct computing power, the right sensors, and the right actuators, not developing them yourself.
The main time-consuming aspect of robot development is the software. By shifting to the cloud, robotic companies can save valuable time and money: they can use data to gain knowledge, discover market trends, learn usage profiles, optimize robot components, and ultimately improve business results.
Until today, companies had to develop their own tools for teleoperation, over-the-air (OTA) updates and upgrades, fleet management, monitoring, analytics, and remote debugging. Creating this cloud infrastructure is a real challenge, resulting in a longer time-to-market for robotic companies.
Cogniteam's cloud-based solution enables robotic companies to focus on their core IP development while integrating off-the-shelf software, because robotics is all about integration. Starting a robotic company with the right cloud-based robotic infrastructure from the beginning makes all the difference! Expedite your time-to-market, enhance your capabilities, and outperform your competitors.
Book a demo and start your way to the finish line.
It starts with a brilliant yet urgent idea that needs to be tested, verified, and implemented in the shortest time frame possible. Off-the-shelf components and ROS are great for that.
Grab a depth cam, strap on a lidar, secure it on a skateboard, control it with a Raspberry Pi and an Arduino, download a ROS-enabled mapping node, open RViz, and there you have it.
Done. Your first demo (loud crowd cheers in the background).
In this setup – Hardware: Seeed Jetson Sub Kit, Velodyne Lidar, RealSense. Software: Cogniteam
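To give a feel for how little code such a first demo actually needs, here is a minimal sketch, assuming ROS 1 (rospy), pyserial, and a made-up serial protocol on the Arduino side; the topic names, serial port, and wheel geometry are illustrative, not taken from the setup above.

```python
# Minimal glue node for a skateboard-style prototype (illustrative sketch only):
# converts /cmd_vel Twist messages into left/right wheel commands sent to the
# Arduino over serial, using a hypothetical "L<left> R<right>\n" protocol.
import rospy
import serial
from geometry_msgs.msg import Twist

WHEEL_BASE = 0.3  # meters between the drive wheels (assumed)

arduino = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)  # port is an assumption

def on_cmd_vel(msg):
    # Differential-drive mixing: split linear.x and angular.z into wheel speeds
    left = msg.linear.x - msg.angular.z * WHEEL_BASE / 2.0
    right = msg.linear.x + msg.angular.z * WHEEL_BASE / 2.0
    arduino.write("L{:.2f} R{:.2f}\n".format(left, right).encode())

rospy.init_node("skateboard_base_driver")
rospy.Subscriber("cmd_vel", Twist, on_cmd_vel, queue_size=1)
rospy.spin()
```

With the off-the-shelf mapping node publishing a map and RViz visualizing it, this kind of glue is often the only custom code the first demo needs.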
Then, a few months in, maybe a year, maybe more, somewhere along the way the direction blurs. The intentions were clear, but during early integration tests several operation modes needed to be managed, so a simple state machine was implemented (see the sketch below); recordings needed to be logged, so one of the engineers built a tool for that; maybe the robot needed to be controlled remotely for a client demo, so image streaming became an issue; several robots may have had slightly different calibration parameters and configurations, so tools were built and scripts were written.
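For context, the "simple state machine" in question is usually something like the following minimal sketch; the mode names and transitions are purely illustrative, not taken from any real robot.

```python
# The kind of ad-hoc mode manager that tends to appear during early integration tests.
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    MAPPING = auto()
    NAVIGATING = auto()
    CHARGING = auto()
    ERROR = auto()

# Allowed transitions, hand-maintained and easy to let drift out of date
TRANSITIONS = {
    Mode.IDLE: {Mode.MAPPING, Mode.NAVIGATING, Mode.CHARGING},
    Mode.MAPPING: {Mode.IDLE, Mode.ERROR},
    Mode.NAVIGATING: {Mode.IDLE, Mode.CHARGING, Mode.ERROR},
    Mode.CHARGING: {Mode.IDLE},
    Mode.ERROR: {Mode.IDLE},
}

class ModeManager:
    def __init__(self):
        self.mode = Mode.IDLE

    def request(self, new_mode: Mode) -> bool:
        """Switch modes if the transition is allowed; return whether it happened."""
        if new_mode in TRANSITIONS[self.mode]:
            self.mode = new_mode
            return True
        return False
```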
Minor things that only slightly derail from the original task, each growing by just a bit.
Developing a platform takes time, and as your team grows and personnel change, standards become critical and tools are needed. That is when systems reach EOL (end of life), tools become obsolete, and maintenance becomes an issue. These tasks are so time-consuming that they block progress, delay goals, and eventually result in an unstable product that is behind schedule.
This is where focus comes into the picture. To bring focus to a robotic company, we believe two questions need to be answered, and they are the same for both the hardware and the software aspects of robotics.
Many tools and frameworks are needed to develop robots, and the more time and effort a company spends on its own IP, the better the return on that investment. In a competitive market, companies differentiate themselves by excelling at their core innovative features. Do not try to rebuild every tool you need.
Whether a company decides to develop an internal solution or opts for a simpler one, technical debt is incurred: the company must retain the ability to develop that same solution at any given time (if the need arises) and to integrate it within its software stack. A robotic company starts off with a prototype and three founders, but if successful it will grow very quickly and will need to support deployment and production at scale. All robotic companies will eventually need tools for remote access, updates, data profiling, permission management, and much more: tools that are not needed by a small company with a prototype but that are absolutely crucial for the company's success. Does every company need to re-develop its own tools for that?
By using a framework such as our platform, these risks can be mitigated, letting companies go on and develop their IP while gaining the flexibility of a community-oriented and commercially supported framework that enables them to integrate cutting-edge services without the risk of falling behind. Use ROS, or any other architecture that lets you quickly prototype and implement your ideas, but deploy, test, and package it using state-of-the-art tools. Do not reinvent your own tools for that. Keep your development team focused on the things that differentiate your company from your competitors by focusing on how your robot excels at its task.
Focus is one of the key aspects of product development, and this is Cogniteam's goal in your development chain: to bring the tools and order that will enable you to focus on your IP.
And this is your skateboard, with a 3D lidar and a stereo camera :-)
Well… you can! And I'm not just talking about running a simulated robot; I'm talking about turning your own laptop into a "robot" and running cool stuff on it using our platform.
Here is a video tutorial on how to do it step by step:
Basically, you need to do 3 things:
When you add a new robot, you'll get an installation string that you can copy and paste into your Linux terminal, then let the platform do all the rest.
When your "robot" is online, click on it and go to -> Configuration.
Congratulations, you’re at step B!
See how easy it is to build a configuration (01:32 in the tutorial), OR just go to the public configurations and use the one I made for you. Look for the configuration Try Nimbus On Your Laptop Configuration under Hub -> Filter by Configurations -> Page 2 -> try_nimbus_on_your_laptop_configuration.
Let’s review the different components and their interactions (see the following figure).
The component in red is the webcam driver. It outputs a raw, uncompressed image (image_raw). This output is the input of three different components. At the top, you'll find the openvino-cpu-detection component, which wraps an algorithm that uses a neural network to detect objects. It outputs an image in which detected objects are labeled with text and bounding boxes. This output is connected to the image-to-image-compressed component that, as you can imagine, compresses the image to reduce bandwidth.
Next is the hands-pose-detection component which outputs an image with hand skeleton recognition. Such components are useful for gesture detection which can communicate different instructions to a robot.
At the bottom, you’ll find the circle-detection component which outputs an image with the biggest circle detected in the image. Such components are useful for tracking items such as balls. You can easily write a behavior that tells the robot to follow a ball or to forage balls etc.
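As a rough idea of what sits inside such a component, here is a minimal circle-detector sketch, assuming ROS 1, cv_bridge, and OpenCV; the topic names and Hough parameters are illustrative, and this is not Cogniteam's actual implementation.

```python
# Illustrative sketch of a circle-detection node: subscribe to a raw image,
# detect the biggest circle with a Hough transform, draw it, and republish.
import rospy
import cv2
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=60, minRadius=10, maxRadius=0)
    if circles is not None:
        # Keep only the biggest circle, mirroring the component's described behavior
        x, y, r = max(np.round(circles[0]).astype(int), key=lambda c: c[2])
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    pub.publish(bridge.cv2_to_imgmsg(frame, encoding="bgr8"))

rospy.init_node("circle_detection_sketch")
pub = rospy.Publisher("circle_detection/image", Image, queue_size=1)
rospy.Subscriber("image_raw", Image, on_image, queue_size=1)
rospy.spin()
```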
You are ready to have some fun!
Deploy the configuration. Wait for it to install – you can see the progress in the Overview tab of the robot’s dashboard. Now hit the Streams tab and view the different image streams.
So there you go. You've configured the robot, and as you can imagine you can now teach it to recognize gestures that tell it to follow you or stop, or maybe to follow a ball, and so on…
Have fun!
Actually, we were early adopters of Player/Stage, the ROS predecessor. Back then, vendors were hesitant to join the Robot Operating System buzz. Generic protocols, software packages, and visualization tools were something each company would develop internally, again and again. Linux was considered good for academia and hackers, and Microsoft was competing to get a foot into the robotics market with Microsoft Robotics Developer Studio.
Back then, making a driver work would usually mean compiling your own Linux Kernel, reading through some obscure forums by the light of a candle, or as my lab professor would say “Here be dragons.” By the time you could see real image data streaming through your C++ code, your laptop graphic display driver would usually stop working due to incompatible dependencies and Ubuntu would crash on boot.
By now, more than a decade has passed. ROS has come into the picture, making data visualization, SLAM algorithms, and navigating robots something that anyone with some free time and a step-by-step tutorial can follow through, test, and customize. Robotic Sensor / Platform vendors themselves are now accepting ROS and releasing git repositories with ready-made ROS nodes — nodes that they themselves used to test and develop the hardware.
This gives the impression that the basic growing pains of robotic software development are now long gone. Buying off-the-shelf components, and building your own robot has never been easier, and that is before we even talk about simulation tools and the cloud.
Here is a weird fact: most of the robots created today are still closed boxes; the OS cannot be updated, and they are not ROS-based. iRobot, for example, discussed in 2019 its intention to move away from a proprietary operating system to a ROS-based one (source), and currently uses ROS only for testing its infrastructure in AWS RoboMaker (source). This is just one example. If you look at the robots around you, most of the issues they face will never be solved and their behavior will not change greatly. Today's ROS-based robots will be replaced by a whole new OS with a full-blown new ROS release in a new robot. But what happens to the old robot and the old code? Remember when phones behaved that way? Before FOTA (Firmware Over The Air). Before app stores. Before Android.
Here is another fun fact: every new robot out in the wild has a blog, and there is a YouTuber out there uploading a review of it, comparing it to all the other vendors' models and algorithms. Why are the companies not doing that to start with? Whole setups are hard to break down and reassemble. Program flows are not transferable. We know: in 2012 we released some basic behavior-tree decision-making code to ROS, and it took until ROS2 to see a behavior engine first used as a standard ROS component. Have you ever tried to reconfigure move_base between robots, set up TFs, and reconfigure thresholds for negative and positive obstacles when the sensor type or position changed? Updating its simulation model? Making sure its dependencies are met across the various ROS versions provided by the vendor?
Sounds like we are back to square one, doesn’t it?
Cogniteam started with those examples in mind, as a way to break the cycle by providing tools to develop, package, deploy, and manage cloud-connected robots. Our cloud-based platform uses containerized applications as software components. On the platform, these components can be organized, connected, and reassembled by code, through a console interface, or from the web using a GUI, so that anyone, even without ROS-specific know-how, can understand and see the building blocks that compose the robot's execution. Deconstructing the mission into containerized blocks also unties the problematic coupling of OS and ROS versions: the isolation makes it possible to run various ROS distributions on the same robot, including ROS1 and ROS2 components together.
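To make that decoupling concrete, here is a conceptual illustration using the generic Docker Python SDK (not Cogniteam's actual API): two components from different ROS distributions running side by side on one host. The image names, container names, and placeholder commands are illustrative.

```python
# Conceptual sketch only: containerized components from different ROS distros on one host.
import docker

client = docker.from_env()

# A ROS 1 (Noetic) component and a ROS 2 (Humble) component, each isolated with its own
# dependencies; the commands are placeholders for real component entrypoints.
client.containers.run("ros:noetic", "roscore",
                      name="legacy-driver-stack", detach=True, network_mode="host")
client.containers.run("ros:humble", "ros2 doctor",
                      name="new-perception-stack", detach=True, network_mode="host")
```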
Components can now be replaced easily, making it simpler to test alternative algorithms, and robot access can be shared between operators and developers, allowing remote access to the robot at any time. Nothing needs to be installed on the robot manually: all installations are managed by an agent running as a service on the robot. Multiple users can see live data or access and change the robot configuration. Using the platform backbone is like having a whole DevOps team inside your team.
The robot configuration, and the tools for viewing and editing it, are also important aspects of the platform. By building the robot model on the platform (configuring the robot's sensors and drivers), you can keep track of driver versions, monitor devices, generate TFs (coordinate transformations between components), and auto-generate a simulation for your code, enabling you to change a sensor's location or type and test alternative scenarios in simulation, all without any coding. The platform also provides introspection and visualization, and analytics tools are coming soon to ease the development of robots and bring on the robotic revolution.
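As a rough illustration of what "generating TFs" means in practice, this is the kind of static transform the platform would spare you from writing by hand; the frame names and offsets are illustrative, assuming ROS 1 and tf2_ros.

```python
# Illustrative sketch: a hand-written static transform from the robot base to a sensor.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("static_tf_sketch")

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = "base_link"   # robot body frame (assumed name)
t.child_frame_id = "velodyne"     # lidar mounted 0.25 m above the base (assumed)
t.transform.translation.z = 0.25
t.transform.rotation.w = 1.0      # identity rotation

broadcaster = tf2_ros.StaticTransformBroadcaster()
broadcaster.sendTransform(t)
rospy.spin()
```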
Log in, start developing right now, and stay tuned for what is to come.
We have just started.