Invented by Sridhar Kocharlakota, Moiz Sonasath, Brett Foster, Keshav Byri, Kuldeep Modi, Nalena Santiago, Siva Boggala, Siva Penke, Samsung Electronics Co Ltd
The Samsung Electronics Co Ltd invention works as follows: methods and devices for controlling user interface windows within a virtual reality environment. The method comprises displaying a window of a virtual reality environment in a viewable region of a display of a head-mountable device (HMD), where the viewable region is determined by an orientation of the HMD. The method also includes receiving a request to lock a position of the window in the virtual reality environment and, in response, locking that position. While locked, the window's position is fixed relative to the viewable region regardless of the orientation of the HMD.
Background for Systems and Methods for Window Control in Virtual Reality Environment
This disclosure relates generally to virtual reality environments, and more specifically to methods and systems for controlling user interface windows within a virtual reality environment.
Virtual reality headsets have become increasingly common, and their user interfaces present unique opportunities as well as unique challenges. Head movement tracking and gaze tracking allow a user to explore a virtual space much larger than what the display can show at once. However, this same freedom of movement can make it hard for the user to concentrate on a single task.
Embodiments of the present disclosure provide systems and methods for controlling user interface windows within a virtual reality environment.
In one embodiment, a method is disclosed. The method comprises displaying a window in a viewable region of a display of a head-mountable device (HMD), where the viewable region is determined by an orientation of the HMD. The method also includes receiving a request to lock a position of the window in the viewable region and, in response, locking the position of the window. While locked, the position of the window is fixed relative to the viewable region regardless of the orientation of the HMD.
In a second embodiment, an HMD is disclosed. The HMD includes a display and a processor. The processor is configured to cause the display to display a window in a viewable region of the display, where the viewable region is determined by an orientation of the HMD. The processor is also configured to receive a request to lock a position of the window and, in response, lock the window's position in the viewable region.
In a third embodiment, a non-transitory computer-readable medium containing a computer program is disclosed. The computer program includes computer-readable program code that, when executed, causes at least one processor of an HMD to cause the display to display a window in a viewable region of the display, where the viewable region is determined by an orientation of the HMD. The program code also causes the at least one processor to receive a request to lock a position of the window and, in response, lock the window's position. The window's position is then fixed relative to the viewable region regardless of how the HMD is oriented.
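The patent text describes the locking behavior only at a high level. As a minimal sketch, assuming windows are positioned by yaw/pitch angles in the environment (the `Window`, `lock_window`, and `window_position` names below are hypothetical, not from the patent), the lock can be modeled as capturing the window's offset from the center of the viewable region at lock time:

```python
from dataclasses import dataclass

@dataclass
class Window:
    """A UI window placed at yaw/pitch angles (degrees) in the VR environment."""
    yaw: float
    pitch: float
    locked: bool = False
    # Offset from the center of the viewable region, captured at lock time.
    lock_offset: tuple = (0.0, 0.0)

def lock_window(window, hmd_yaw, hmd_pitch):
    """Lock the window at its current offset from the viewable region's center."""
    window.lock_offset = (window.yaw - hmd_yaw, window.pitch - hmd_pitch)
    window.locked = True

def window_position(window, hmd_yaw, hmd_pitch):
    """Where the window appears: world-fixed when unlocked, view-fixed when locked."""
    if window.locked:
        # Locked: the window rides along with the viewable region as the HMD turns.
        return (hmd_yaw + window.lock_offset[0], hmd_pitch + window.lock_offset[1])
    # Unlocked: the window stays at its fixed place in the virtual environment.
    return (window.yaw, window.pitch)
```

Under this model, an unlocked window scrolls out of view as the user turns, while a locked window keeps the same on-screen position no matter how the HMD is oriented.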
In a fourth embodiment, an HMD is disclosed. The HMD includes a display and a processor. The processor is configured to execute an application that performs a plurality of tasks and to generate display content that includes a plurality of windows, each window corresponding to at least one of the tasks. At least one of the windows is displayed in a viewable region of the display, and the displayed window is determined by the orientation of the HMD. In response to a change in the orientation of the HMD, the processor causes the display to display another of the windows, corresponding to another of the tasks, in the viewable region.
In a fifth embodiment, a non-transitory computer-readable medium containing a computer program is disclosed. The computer program includes computer-readable program code that, when executed, causes at least one processor of an HMD to execute an application that performs a plurality of tasks. The program code also causes the processor to generate display content that includes a plurality of windows, each window corresponding to at least one of the tasks. At least one of the windows is displayed in a viewable region of a display, determined by an orientation of the HMD. In response to a change in the orientation of the HMD, the program code causes the at least one processor to display another of the windows, for another of the tasks, in the viewable region.
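The fourth and fifth embodiments imply that each task window occupies a direction in the environment, so turning the head selects a different task. A minimal sketch, assuming task windows are laid out at yaw angles around the 360-degree environment (the `select_window` function and the `layout` mapping are hypothetical illustrations):

```python
def select_window(hmd_yaw, windows):
    """Pick the task window whose yaw sector contains the HMD's current yaw.

    `windows` maps a task name to the center yaw (degrees) at which its
    window is placed; each window owns the sector nearest its center.
    """
    hmd_yaw %= 360.0

    def angular_distance(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    return min(windows, key=lambda task: angular_distance(windows[task], hmd_yaw))

# Hypothetical layout: three task windows spread around the environment.
layout = {"editor": 0.0, "spreadsheet": 120.0, "email": 240.0}
```

With this layout, looking straight ahead shows the editor, while turning roughly a third of the way around brings the spreadsheet window into the viewable region.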
Other technical features will be apparent to one skilled in the art from the following figures, descriptions, and claims.
Before undertaking the DETAILED DESCRIPTION below, it may be helpful to define certain words and phrases used throughout this patent document. The term "couple" and its derivatives refer to any direct or indirect communication between two or more elements. The terms "transmit," "receive," and "communicate," as well as derivatives thereof, encompass both direct and indirect communication. The terms "include" and "comprise," as well as derivatives thereof, mean inclusion without limitation. The term "or" is inclusive, meaning and/or. The phrase "associated with," as well as derivatives thereof, means to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, be proximate to, be bound to or with, or have a property of or a relationship to. The term "controller" means any device, system, or part thereof that controls at least one operation. A controller may be implemented in hardware, software, firmware, or some combination thereof. The functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. The phrase "at least one of," when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, "at least one of A, B, and C" includes any of the following combinations: A; B; C; A and B; A and C; B and C; and A and B and C.
Moreover, various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer-readable program code and embodied in a computer-readable medium. The terms "application" and "program" refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in suitable computer-readable program code. The phrase "computer-readable program code" includes any type of computer code, including source code and executable code. The phrase "computer-readable medium" includes any type of medium capable of being accessed by a computer, such as a CD, a DVD, or a hard disk drive. A "non-transitory" computer-readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer-readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
Definitions for certain other words and phrases are provided throughout this patent document. Those of ordinary skill in the art should understand that, in many if not most instances, such definitions apply to prior as well as future uses of such defined words and phrases.
FIGS. 1 through 9, discussed below, and the various embodiments described in this document are for illustration only and should not be construed to limit the scope of the disclosure. Those skilled in the art will understand that the principles of this disclosure can be implemented in any suitably arranged system or device.
Embodiments of the present disclosure recognize that users of virtual reality headsets can use head movement tracking and gaze tracking to navigate intuitively within a virtual reality environment. The virtual reality environment can be generated by a processor or display component of the headset. Head movement tracking and gaze tracking can be combined with the large viewing area of the virtual environment (up to 360 degrees) to support various productivity tasks. The viewable area is the portion of the virtual environment displayed on the headset display at any given time. While gaze tracking and head movement can be very intuitive and useful for navigating a large virtual reality environment (for example, by panning the viewable area around the virtual environment), a user may at times want to focus on one specific task without moving the viewable area. It can be difficult to hold the viewable area in one place to perform productivity tasks such as data entry or word processing.
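The relationship between the 360-degree environment and the viewable area can be sketched as follows, assuming the environment is indexed by yaw angle and the headset shows a fixed horizontal field of view centered on the HMD's orientation (the `viewable_area` and `is_visible` helpers and the 100-degree default are illustrative assumptions, not values from the patent):

```python
def viewable_area(hmd_yaw, fov=100.0):
    """Return the (left, right) yaw bounds, in degrees, of the viewable area.

    The 360-degree environment is indexed by yaw; the headset shows only a
    horizontal field of view (`fov`) centered on the HMD's current yaw.
    """
    half = fov / 2.0
    return ((hmd_yaw - half) % 360.0, (hmd_yaw + half) % 360.0)

def is_visible(window_yaw, hmd_yaw, fov=100.0):
    """True if a window placed at `window_yaw` falls inside the viewable area."""
    # Signed angular offset in [-180, 180), handling wrap-around at 0/360.
    offset = (window_yaw - hmd_yaw + 180.0) % 360.0 - 180.0
    return abs(offset) <= fov / 2.0
```

Panning is then just re-evaluating these bounds as head tracking updates `hmd_yaw`; content outside the bounds exists in the environment but is not drawn.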
Accordingly, the present disclosure includes methods and systems that allow a user of a virtual reality device to lock certain tasks within the viewable region of the virtual environment. The user may want to lock a user interface element (such as a window) within the viewable region while still allowing the viewable region to be panned around the window using gaze tracking and head movement. In another embodiment, the user may want to lock the viewable region itself in place, temporarily ignoring gaze tracking and head movement.
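These two behaviors amount to two distinct lock modes: one pins a window to the view while the view keeps panning, and the other freezes the view itself against head tracking. A minimal sketch of the second mode, using hypothetical names (`LockMode`, `effective_view_yaw`) not taken from the patent:

```python
from enum import Enum

class LockMode(Enum):
    NONE = 0         # head tracking pans the view; windows stay in the world
    WINDOW = 1       # a window rides along with the viewable region
    VIEW_REGION = 2  # the viewable region itself ignores head tracking

def effective_view_yaw(hmd_yaw, mode, view_yaw_at_lock):
    """Yaw at which the viewable region is centered, given the lock mode."""
    if mode is LockMode.VIEW_REGION:
        # Head movement is temporarily ignored; the view stays where it was locked.
        return view_yaw_at_lock
    # In the other modes, the view continues to track the HMD's orientation.
    return hmd_yaw
```

In `VIEW_REGION` mode the rendered view is stable regardless of head motion, which is the behavior suited to data entry or word processing described above.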
FIG. 1 illustrates an example communication system 100 in which an embodiment of the present disclosure may be implemented. The embodiment of the communication system 100 shown in FIG. 1 is for illustration only. Other embodiments of the communication system 100 could be used without departing from the scope of this disclosure.
As shown in FIG. 1, the system 100 includes a network 102, which facilitates communication between various components in the system. The network 102 may communicate Internet Protocol (IP) packets, frame relay frames, or other information between network addresses. The network 102 may include one or more local area networks, metropolitan area networks, wide area networks, or all or a portion of the Internet.
The network 102 facilitates communication between various servers 103 and 104 and various electronic devices 106-116. Each server 104 includes any suitable computing or processing device that can provide computing services for one or more electronic devices. Each server 104 could, for example, include one or more processors, one or more memories storing instructions and data, and one or more network interfaces facilitating communication over the network 102.
Each of the electronic devices 106-116 represents any suitable computing or communication device that interacts with a server or another computing device over the network 102. In this example, the electronic devices 106-116 include devices such as a desktop computer, mobile phones or smartphones, laptop computers 112, tablet computers 114, an HMD, smart watches, and the like. Any other or additional electronic devices could be used in the communication system 100.