Many electronic devices, such as the iPhone and iPad, offer a touchscreen as the way users interact with the system. Although touch input is found in almost all modern devices, there are settings where such interaction would be useful but is not currently possible. For example, a teacher in a classroom may use a projector to display content on a wall for students.
At present, the teacher can change the displayed content by interacting with the projector through on-screen buttons or a controller, but only while directly connected to the host computer, or by using a very expensive smart board capable of recognizing touch commands.
Apple’s patent, filed with the U.S. Patent and Trademark Office as a “laser input detection system based on a combination of two-dimensional and three-dimensional user scans,” describes how such interactions could take place on a wall or any other surface without the need for touch-sensitive hardware. Simply put, the invention revolves around emitting light and detecting any light reflected back to the device. For example, consider a surface being illuminated while an object obstructs part of the light’s path to it. This occluding object, which could be a pen or the user’s hand, is what the system’s laser pulses must detect.
To do this, Apple suggests that the projector’s light source for detecting reflections could be a laser diode, specifically a vertical-cavity surface-emitting laser (VCSEL). In fact, this is the same type of emitter used for Face ID biometric recognition on iPhones. Using a laser to illuminate the image makes sense because it provides more uniform light than a conventional lamp-based projector, giving the best chance of a direct reflection back to the device.
Apple’s patent documentation also mentions interferometry. Interferometry is a method of measuring distance and other data points by combining multiple light sources to create an interference pattern that can be analyzed for displacement. Using interferometry, the system can measure not only the distance the light travels to reach the projection surface, but also the distance of each occluding object from that target surface.
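The core relationship behind this kind of measurement can be sketched in a few lines. The following is an illustrative example, not code from the patent: in a Michelson-style interferometer, moving a reflecting object by a distance d changes the round-trip optical path by 2d, which shifts the interference phase by 2π·(2d)/λ. Inverting that relation recovers displacement from a measured phase shift. The 940 nm wavelength below is an assumption, chosen because it is typical for near-infrared VCSELs.

```python
import math

def displacement_from_phase(phase_shift_rad: float, wavelength_m: float) -> float:
    """Displacement of a reflecting object inferred from an interferometric
    phase shift.  The reflected beam travels the extra distance twice
    (out and back), hence the factor of 4*pi rather than 2*pi."""
    return phase_shift_rad * wavelength_m / (4 * math.pi)

# Example: with an assumed 940 nm near-infrared laser, a phase shift of
# pi radians corresponds to a quarter-wavelength displacement.
d = displacement_from_phase(math.pi, 940e-9)
print(f"{d * 1e9:.0f} nm")  # 235 nm (= 940 / 4)
```

This sketch only covers displacement from a single phase reading; a real system would have to unwrap phase across many readings to track distances larger than one wavelength.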
Apple believes that by tracking the position and distance of objects relative to the projected display, the system can intelligently determine which objects are moving and which intend to interact with the displayed content. The process can involve object movements defined as gestures: for example, a user’s finger appearing to touch the surface, or even a non-contact gesture performed in mid-air with the hands and fingers.
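The distinction between a touch on the surface and a mid-air gesture can be illustrated with a simple distance check. This is a hypothetical sketch, not Apple’s actual algorithm; the threshold values and the `TrackedObject` type are assumptions made for the example.

```python
from dataclasses import dataclass

# Assumed thresholds: an object is "touching" the projection surface
# when its measured distance drops below a small contact tolerance,
# and "hovering" (a possible mid-air gesture) within a larger band.
TOUCH_THRESHOLD_M = 0.005  # 5 mm
HOVER_THRESHOLD_M = 0.15   # 15 cm

@dataclass
class TrackedObject:
    x: float                    # position on the projected image (metres)
    y: float
    distance_to_surface: float  # measured, e.g. interferometrically

def classify(obj: TrackedObject) -> str:
    """Classify a tracked object as touch, hover, or idle."""
    if obj.distance_to_surface <= TOUCH_THRESHOLD_M:
        return "touch"
    if obj.distance_to_surface <= HOVER_THRESHOLD_M:
        return "hover"
    return "idle"

finger = TrackedObject(x=0.42, y=0.17, distance_to_surface=0.003)
print(classify(finger))  # touch
```

A production system would also track these classifications over time to separate deliberate gestures from incidental movement, which is what the patent’s motion analysis is meant to do.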
In Apple’s proposed process, the system needs only a short period to calibrate the projected display to a new environment, so in theory it would work on almost any surface. A smooth surface may not even be required: by collecting distance data points, it could work well on curved or uneven surfaces. The patent application was filed on May 9, 2019.
The concept of displaying visual content on ordinary surfaces and interacting with it without a touchscreen has been around in the tech world for a long time; probably the best-known examples are the virtual keyboards projected onto surfaces that have appeared in recent years. Typically projected onto a flat surface using infrared light, these keyboards can detect the presses and commands the user makes on the projected content. Such devices have been in development for about two decades, but for reasons such as inaccuracy and slowness compared to physical keyboards, they have largely failed to win over users.
These keyboards usually rely on an invisible infrared light pattern emitted over the same surface. When the user taps within a projected virtual key area, the tap is detected and registered by image sensors on the device. Although similar in practice, this is not the technique proposed in Apple’s patent, because Apple uses optical emitters to measure distance rather than relying on a secondary invisible light pattern.
In the past, the Cupertino company has worked several times on inventions involving light-emitting projection, and seems determined to bring this technology to market eventually. For example, in October 2020, the company was granted a patent for illuminating objects to provide augmented-reality imagery. In addition, in 2013, Apple proposed that it could make current laptops and desktops obsolete with a desk that uses a projector instead of an LCD display, and that even supports inductive charging.
Apple patents countless inventions every week. A patent reflects the interests of a company’s research and development unit, but it does not guarantee that the technology will appear in a future product or service. What do you, Zomit readers, think about Apple’s new invention for making ordinary surfaces touch-capable, and its applications?