US20240320908A1 - Systems, methods, and non-transitory computer-readable mediums for displaying a virtual space - Google Patents
- Publication number: US20240320908A1 (application US 18/186,553)
- Authority: US (United States)
- Prior art keywords: user, virtual, computer implemented, virtual space, vehicle
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Definitions
- the present disclosure relates to systems, methods, and non-transitory computer-readable mediums for displaying a virtual space.
- Graphical user interfaces display a computer-generated environment to a user. This may be done through a conventional graphical user interface of a user device such as a phone or tablet.
- the computer-generated environment may also be displayed through a virtual reality (VR) headset, which creates a sense of presence and immersion within the computer-generated environment.
- Conventional computer-generated environments are generalized and do not take into account preferences of the user when displaying the computer-generated environment. However, customization of the computer-generated environment plays a critical role in the nature of the experience a user has with the user device.
- the present disclosure provides systems, methods, and non-transitory computer-readable mediums for displaying a virtual space.
- the systems, methods, and non-transitory computer-readable mediums display a virtual space customized to user preferences by receiving information from a user profile and displaying virtual features based on the information from the user profile.
- the user information may include a user location, user design preference, or at least one user metric. With a customized virtual space, the user may have a more enjoyable and customized user experience when viewing the virtual space through the graphical user interface.
- a computer implemented system for displaying a virtual space includes a graphical user interface and a controller.
- the controller is programmed to receive user information from a user profile.
- the user information includes a user location, a user design preference, and at least one user metric.
- the controller is also programmed to determine display conditions based on the user information, determine user accessibility to virtual features based on the at least one user metric, and display the virtual space including the virtual features on the graphical user interface based on the display conditions and the user accessibility.
- the virtual space is displayed as a virtual 3D rendered image.
- a computer implemented method for displaying a virtual space includes receiving user information from a user profile, wherein the user information comprises a user location, a user design preference, and at least one user metric.
- the method further includes determining display conditions based on the user information, determining user accessibility to virtual features based on the at least one user metric, and displaying the virtual space including the virtual features on a graphical user interface based on the display conditions and the user accessibility.
- the virtual space is displayed as a virtual 3D rendered image.
- FIG. 1 depicts an exemplary virtual space displayed by a computer implemented system for displaying the virtual space, according to one or more embodiments shown and described herein;
- FIG. 2 illustrates an example of a user viewing a virtual car dealership through a virtual reality device, according to one or more embodiments shown and described herein;
- FIG. 3 depicts a schematic diagram of the system for displaying the virtual space, according to one or more embodiments shown and described herein;
- FIG. 4 depicts a flowchart for a method for displaying the virtual space, according to one or more embodiments shown and described herein.
- the present disclosure provides systems, methods, and non-transitory computer-readable mediums for displaying a virtual space.
- the systems, methods, and non-transitory computer-readable mediums display a virtual space customized to user preferences by receiving information from a user profile and displaying virtual features based on the information from the user profile.
- the user information may include a user location, user design preference, or at least one user metric.
- the systems, methods, and non-transitory computer-readable mediums may further use a virtual reality (VR) device, an external user application, or a vehicle processor to display and customize the virtual space to the user.
- Customization of the virtual space through the utilization of user preferences generates a more familiar, customized, and enjoyable virtual space for the user.
- connecting to a user profile, customizing the virtual space in real time, and offering discounts provide a significant advantage over prior, non-customizable systems.
- FIG. 1 depicts an exemplary virtual space 102 displayed by a computer implemented system 100 for displaying the virtual space 102 , according to one or more embodiments shown and described herein.
- the virtual space 102 displayed by a system 100 may be displayed through a graphical user interface 104 .
- the virtual space 102 may be displayed as a 3D rendered image.
- a controller 106 (discussed further below), may be programmed to display the virtual space 102 on the graphical user interface 104 .
- the virtual space 102 may be any number of environments the user wishes the graphical user interface 104 to display.
- the virtual space 102 may include a virtual shopping center, a virtual concert, or any other suitable virtual space 102 that the user wishes to display.
- the virtual space 102 is depicted as a virtual car dealership 103 .
- the virtual space 102 may be based on real images.
- real images of a dealership may be stored in a memory of the present system.
- the real images may be collected from publicly accessible data, e.g., images from social networking service, images from official homepages of the dealership, and the like.
- the controller 106 of the present system may transform the 2D real images into 3D rendered images that may be used for a virtual reality experience.
- the virtual car dealership 103 may render virtual vehicles 107 , virtual buildings 108 , or virtual users 110 in the virtual space 102 .
- the virtual car dealership 103 may appear similar to a physical car dealership.
- the virtual car dealership 103 may be designed to replicate a particular physical car dealership.
- the present system obtains images of actual vehicles displayed on the property of the physical car dealership, and renders 3D images of corresponding vehicles based on the obtained images.
- the present system obtains the inventory information of the particular physical car dealership and renders 3D virtual vehicles based on the inventory information.
- the inventory information may include detailed information about the vehicles such as make, models, colors, accessories, and the like.
- the virtual car dealership 103 may also be a generic representation of the physical car dealership.
- the virtual car dealership 103 may be a computer generated car dealership that does not exist as a physical car dealership.
- one or more icons, texts, or other graphical elements may indicate descriptions of vehicles, prices of vehicles, or other relevant information in the virtual car dealership 103 .
- the type or style of virtual car dealership 103 that is displayed may depend on user information received from a user profile (as described further below).
- the rendering of the 3D virtual vehicles may be dynamically updated based on the status of physical vehicles. For example, when a customer is test driving a physical vehicle at a dealership, the physical vehicle may transmit, to the controller 106, its current status, such as information that the physical vehicle is currently being driven, or the current location of the physical vehicle.
- the controller 106 may update the rendering of the virtual vehicle 107 corresponding to the physical vehicle based on the information that the physical vehicle is currently being driven or on the current location of the physical vehicle. For example, the controller 106 may update the color of the virtual vehicle 107 to gray to indicate that the virtual vehicle is not available. As another example, the controller 106 may remove the virtual vehicle 107 from the virtual dealership.
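The dynamic-update behavior described above can be sketched as follows. This is an illustrative sketch only; the patent does not specify an implementation, and all names, fields, and the gray-out/remove policies here are assumptions:

```python
# Hypothetical sketch: when a physical vehicle reports that it is being
# test driven, the corresponding virtual vehicle is grayed out or removed.
def update_virtual_vehicle(virtual_vehicles: dict, vehicle_id: str,
                           status: dict, policy: str = "gray") -> dict:
    vehicles = dict(virtual_vehicles)  # leave the caller's state untouched
    if status.get("driving"):
        if policy == "gray":
            # Mark the virtual vehicle unavailable rather than removing it.
            vehicles[vehicle_id] = {**vehicles[vehicle_id],
                                    "color": "gray", "available": False}
        else:  # policy == "remove"
            vehicles.pop(vehicle_id, None)
    return vehicles

lot = {"vin123": {"color": "red", "available": True}}
updated = update_virtual_vehicle(lot, "vin123", {"driving": True})
```

Either policy keeps the virtual lot consistent with the physical one; which to use would be a rendering choice rather than anything the claims require.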
- FIG. 2 illustrates an example of the user viewing the virtual car dealership 103 through a VR device 112 , according to one or more embodiments described and illustrated herein.
- the graphical user interface 104 may be a user interface of a user device 111 .
- the user device 111 may be a phone, tablet, a computer, or any other device that may display the graphical user interface 104 .
- the user device 111 may include the graphical user interface 104 and the controller 106 .
- the user device 111 may include the VR device 112 .
- the VR device 112 may include a VR headset 113 that the user wears to view the virtual space 102 .
- the user is viewing the virtual car dealership 103 .
- the user may also view virtual shopping centers, virtual concerts, virtual sports games, or any other suitable virtual space 102 the user wishes to view on the graphical user interface 104 of the user device 111 .
- the user may interact with the virtual space 102 .
- the user may interact with the virtual space 102 through the graphical user interface 104 when the user device 111 includes a touch-screen graphical user interface 104.
- the user may select certain objects, change the view, or navigate the virtual space 102 through the graphical user interface 104 of the user device 111 .
- the user device 111 may include a microphone 114 , such that the user can speak commands into the user device 111 in order to interact with the virtual space 102 .
- the VR device 112 may include hand controllers 115, allowing the user to interact with the virtual space 102 through the hand controllers 115.
- FIG. 3 depicts a schematic diagram of the system 100 for displaying the virtual space 102 , according to one or more embodiments shown and described herein.
- the virtual space 102 may be displayed through the controller 106 .
- the controller 106 may be included in a server, e.g., a cloud server, an edge server, and the like.
- the controller 106 may be programmed to receive user information from a user profile, determine display conditions based on the user information, determine user accessibility to virtual features based on at least one user metric, and display the virtual space 102 on the graphical user interface 104 .
- One or more of the controller 106 functions may be executed in real time.
- the controller 106 may include a processor 118 and a memory unit 120 .
- the processor 118 can be any device capable of executing machine-readable and executable instructions 128 .
- the processor 118 may be a controller, an integrated circuit, a microchip, a computer, or any other computing device.
- the processor 118 is coupled to a communication path 132 that provides signal interconnectivity between various modules of the system 100 .
- the communication path 132 may communicatively couple any number of processors 118 with one another, and allow the modules coupled to the communication path 132 to operate in a distributed computing environment.
- Each of the modules may operate as a node that may send and/or receive data.
- the term “communicatively coupled” means that coupled components are capable of exchanging data signals with one another such as electrical signals via a conductive medium, electromagnetic signals via air, optical signals via optical waveguides, and the like.
- the communication path 132 may be formed from any medium that is capable of transmitting a signal such as conductive wires, conductive traces, optical waveguides, or the like. In some embodiments, the communication path 132 may facilitate the transmission of wireless signals, such as WiFi, Bluetooth®, Near Field Communication (NFC), and the like. The communication path 132 may be formed from a combination of mediums capable of transmitting signals. In one embodiment, the communication path 132 comprises a combination of conductive traces, conductive wires, connectors, and buses that cooperate to permit the transmission of electrical data signals to components such as processors, memories, sensors, input devices, output devices, and communication devices.
- the communication path 132 may comprise a vehicle bus, such as a LIN bus, a CAN bus, a VAN bus, and the like. Additionally, it is noted that the term “signal” means a waveform (e.g., electrical, optical, magnetic, mechanical, or electromagnetic), such as DC, AC, sinusoidal-wave, triangular-wave, square-wave, vibration, and the like, capable of traveling through a medium.
- the memory unit 120 may contain a non-transitory computer-readable medium comprising RAM, ROM, flash memories, hard drives, or any device capable of storing machine-readable and executable instructions 128 such that the machine-readable and executable instructions 128 can be accessed by the processor 118.
- the machine-readable and executable instructions 128 may comprise logic or algorithm(s) written in any programming language of any generation (e.g., 1GL, 2GL, 3GL, 4GL, or 5GL) such as, for example, machine language that may be directly executed by the processor 118 , or assembly language, object-oriented programming (OOP), scripting languages, microcode, etc., that may be compiled or assembled into machine-readable and executable instructions 128 and stored in the memory unit 120 .
- the machine-readable and executable instructions 128 may be written in a hardware description language (HDL), such as logic implemented via either a field-programmable gate array (FPGA) configuration or an application-specific integrated circuit (ASIC), or their equivalents.
- the methods described herein may be implemented in any conventional computer programming language, as pre-programmed hardware elements, or as a combination of hardware and software components.
- the processor 118 along with the memory unit 120 may operate as the controller 106 for the system 100 .
- the controller 106 may be communicatively coupled to the user device 111 (depicted in FIG. 1 ), a processor of the user vehicle 134 (discussed further below), a weather reporting system 116 (discussed further below), or each of the foregoing, by a network 130 .
- the network 130 may include one or more computer networks (e.g., a personal area network, a local area network, or a wide area network), cellular networks, satellite networks and/or a global positioning system and combinations thereof.
- the controller 106 may be communicatively coupled to the network 130 via a wide area network, a local area network, a personal area network, a cellular network, a satellite network, etc.
- Suitable local area networks may include wired Ethernet and/or wireless technologies such as Wi-Fi.
- Suitable personal area networks may include wireless technologies such as IrDA, Bluetooth®, Wireless USB, Z-Wave, ZigBee, and/or other near field communication protocols.
- Suitable cellular networks include, but are not limited to, technologies such as LTE, WiMAX, UMTS, CDMA, and GSM.
- the controller 106 may receive user information from the user profile.
- the user information on the user profile may be manually entered by the user through the user device 111 .
- the user may input a name and address into the user profile, and the remaining user information may be autocompleted based on a user profile database 124 stored on the memory unit 120.
- the controller 106 may also be communicatively coupled to an external user application 122 .
- the external user application 122 may be an application installed on the user device 111 .
- the external user application 122 may include a video-streaming application, a gaming application, a ride-share application, or any other external user application 122 that may be installed on the user device 111 .
- the controller 106 may receive user application metrics from the external user application 122 .
- the user application metrics and the external user application 122 are discussed further below.
- the user information may include a user location, a user design preference, current vehicle information, at least one user metric, or any other user information that may be stored on the user profile.
- the user location may be determined through a global positioning system (GPS) of the user device 111 .
- the user location may include a country, state, city, or zip code that the user is located in.
- the user may set the user location to a location different than that of the physical location of the user determined through the GPS of the user device 111 . For example, the user may set the location as New York City, even though the GPS of the user device 111 has determined that the user location is West Virginia.
- the user information may also include user demographic information on the user profile.
- the user demographic information may include age, gender, marital status, or any other demographic information that may be stored on the user profile.
- the user demographic information may assist the controller 106 in determining the display conditions to display on the graphical user interface 104 , as discussed further below.
- the user design preference may include preferences on how the user prefers the virtual space 102 to appear.
- the user design preference includes a preferred architecture.
- the user may select the preferred architecture from a plurality of architectures listed on the graphical user interface 104 .
- the plurality of architectures may include modern, colonial, Georgian, or any other architecture type that the user may prefer.
- the user preferences may also include a brightness of colors (i.e., light or dark colors), a time of day, or any other preference the user may have when viewing the virtual space 102 .
- the virtual space 102 may be designed to include the preferred architecture. For example, if the preferred architecture is modern, the virtual buildings 108 may be of a modern architecture. On the other hand, if the preferred architecture is colonial, the virtual buildings 108 in the virtual space 102 may be of a colonial architecture.
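Applying the preferred architecture to the virtual buildings 108 could be sketched as a simple lookup. The style table below is purely illustrative; the patent names the architecture types but not any style attributes:

```python
# Hypothetical mapping from preferred architecture to rendering attributes.
ARCHITECTURE_STYLES = {
    "modern": {"roof": "flat", "facade": "glass"},
    "colonial": {"roof": "gabled", "facade": "brick"},
    "georgian": {"roof": "hipped", "facade": "symmetric brick"},
}

def render_buildings(count: int, preferred_architecture: str) -> list:
    # Fall back to a default style if the preference is unrecognized.
    style = ARCHITECTURE_STYLES.get(preferred_architecture,
                                    ARCHITECTURE_STYLES["modern"])
    return [{"id": i, **style} for i in range(count)]

buildings = render_buildings(3, "colonial")
```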
- a plurality of user profiles may be included on the user profile database 124 stored on the memory unit 120 .
- the plurality of user profiles include user information corresponding to the users in each of the user profiles.
- the user design preference for a new user may be automatically determined by the controller 106 by comparing the user information of the user profile of the new user with the user information stored on the plurality of user profiles of users that have similar background as the new user on the user profile database 124 .
- the controller 106 may recognize that user profiles in the plurality of user profiles that are male, 50 years old, and located in Cleveland, Ohio, often select a colonial architecture from the plurality of architectures. Based on this recognition, the controller 106 may automatically determine the user design preference as the colonial architecture type for the user profile with similar user information (i.e., male, 48 years old, located in Columbus, Ohio).
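The similar-profile inference described above amounts to finding profiles with a comparable background and taking their most common preference. The similarity rule (same gender, same state, small age gap) and all data below are illustrative assumptions:

```python
from collections import Counter

# Hypothetical sketch: infer a new user's design preference from the
# preferences of similar profiles in the user profile database.
def infer_design_preference(new_user: dict, profiles: list, max_age_gap: int = 5):
    similar = [
        p for p in profiles
        if p["gender"] == new_user["gender"]
        and abs(p["age"] - new_user["age"]) <= max_age_gap
        and p["state"] == new_user["state"]
    ]
    if not similar:
        return None  # no basis for an automatic determination
    # The most common preference among similar profiles wins.
    return Counter(p["design_preference"] for p in similar).most_common(1)[0][0]

profiles = [
    {"gender": "male", "age": 50, "state": "OH", "design_preference": "colonial"},
    {"gender": "male", "age": 52, "state": "OH", "design_preference": "colonial"},
    {"gender": "female", "age": 30, "state": "NY", "design_preference": "modern"},
]
pref = infer_design_preference({"gender": "male", "age": 48, "state": "OH"}, profiles)
```

A production system would presumably use a richer similarity measure; this shows only the gating logic.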
- the at least one user metric may be based on metrics of a user vehicle.
- the user vehicle may be a vehicle including an automobile or any other passenger or non-passenger vehicle such as, for example, a terrestrial, aquatic, and/or airborne vehicle.
- the user vehicle may be an autonomous driving vehicle.
- the user vehicle may be an autonomous vehicle that navigates its environment with limited human input or without human input.
- the processor of the user vehicle 134 may be communicatively coupled to the controller 106 (depicted in FIG. 3 ). As such, the controller 106 may monitor a plurality of driving metrics recorded by the processor of the user vehicle 134 and classify the plurality of driving metrics as the at least one user metric.
- the at least one user metric may include a number of miles the user vehicle has been driven.
- the number of miles the user vehicle has been driven may include a total number of miles that the user vehicle has been driven.
- the number of miles of the user vehicle may be transmitted from the processor of the user vehicle 134 to the controller 106 in real time.
- the number of miles the user vehicle has been driven may be a number of miles the user vehicle has been driven since the user has purchased the user vehicle (in scenarios where the user buys a used vehicle) or an average number of miles the user vehicle is driven per day, week, month, or year.
- the at least one user metric may also include a number of consecutive days, weeks, or months the user vehicle has been driven. For example, the at least one user metric may be that the user vehicle is driven every day, or that the user vehicle is driven only Monday through Friday.
- the processor of the user vehicle 134 connected to the controller 106 may include a GPS of the user vehicle. Therefore, the at least one user metric may include a location history 126 of the user vehicle.
- the location history 126 of the user vehicle may be stored on the memory unit 120 (as depicted in FIG. 3 ) and may include routes the user vehicle has driven, a number of counties or states the user vehicle has been driven to, a number of National Parks the user vehicle has been driven in, or any other suitable location history 126 of the user vehicle.
- the controller 106 may also be programmed to determine display conditions based on the user information.
- the display conditions correspond to how the virtual space 102 appears to the user through the graphical user interface 104 .
- the display conditions may directly correspond to the user information from the user profile in order to replicate an environment the user is in. For example, if the user information includes that the user location as Denver, Colorado, and the user design preference is modern, the display conditions may include a mountainous background and virtual buildings 108 in the virtual space 102 may be rendered in a modern architectural style.
- the display conditions may also include a weather condition or a time of day.
- the time of day may change the display conditions in the virtual space 102 , such as an amount of sunlight. For example, if the time of day is 9 PM, no sunlight may be displayed in the virtual space 102 and the display conditions may be adjusted to nighttime.
- the weather condition may also affect the display conditions in the virtual space 102 .
- the weather condition may be retrieved from the weather reporting system 116 communicatively coupled to the controller 106 through the communication path 132 (depicted in FIG. 2 ).
- the controller 106 may obtain information from the weather reporting system 116 by querying the weather reporting system 116 with the user location.
- the controller 106 may then adjust the display conditions in the virtual space 102 based on the weather conditions at the user location.
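Deriving display conditions from the time of day and the reported weather could look like the sketch below. The daytime window and condition fields are assumptions; the weather reporting system 116 is external, so its response is stubbed as a plain string:

```python
import datetime

# Hypothetical sketch: adjust sunlight and day/night mode from the
# current time and a weather report for the user location.
def derive_display_conditions(now: datetime.datetime, weather: str) -> dict:
    daytime = 6 <= now.hour < 20  # assumed daylight window
    return {
        "mode": "day" if daytime else "night",
        "sunlight": daytime and weather == "clear",
        "weather": weather,
    }

# At 9 PM, no sunlight is displayed and the conditions switch to nighttime.
conditions = derive_display_conditions(
    datetime.datetime(2024, 3, 1, 21, 0), weather="clear")
```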
- the controller 106 may also be programmed to determine user accessibility to virtual features based on the user information or the at least one user metric.
- the virtual features may include features of the virtual space 102 .
- the virtual features may include a locked virtual room.
- the locked virtual room may include discounts that a user can select from once the locked virtual room has become unlocked.
- the virtual features such as the locked virtual room may only be unlocked/accessible if the user information or the at least one user metric meets a threshold condition.
- the user accessibility to the virtual features may be based on whether the user vehicle has entered within a defined location. For example, user accessibility to virtual features of a beach background in the virtual space 102 may be based on whether the user vehicle has travelled within 5 miles of an ocean coastline. In other embodiments, user accessibility to virtual features of the virtual space 102 may be based on whether the user vehicle has entered within a certain number of defined locations. For example, accessibility of virtual features in the virtual space 102 may be based on whether the user vehicle has entered within the defined location of 5 National Parks.
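The defined-location check above is essentially a geofence test over the vehicle's location history. The sketch below uses a haversine distance; the coordinates, the 5-mile radius (from the example), and all names are illustrative:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in miles.
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical sketch: the beach background unlocks if any point in the
# vehicle's location history is within 5 miles of a coastline point.
def beach_background_unlocked(location_history, coastline_point, radius_miles=5.0):
    return any(haversine_miles(lat, lon, *coastline_point) <= radius_miles
               for lat, lon in location_history)

history = [(36.97, -122.03), (37.77, -121.97)]  # one point near the coast
unlocked = beach_background_unlocked(history, (36.95, -122.02))
```

Counting visits to a set of defined regions (e.g., 5 National Parks) would apply the same distance test once per region and compare the tally to a threshold.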
- a notification that the virtual features are now accessible may be transmitted to the user device 111 or to the processor of the user vehicle 134 .
- the user device 111 or the user vehicle 134 may display the accessible virtual features on a screen of the user device 111 or the user vehicle 134 .
- the user accessibility to virtual features may be based on the at least one user metric of the number of miles the user vehicle has been driven. For example, a virtual room in the virtual space 102 described above may be locked if the user vehicle has been driven under 100,000 miles. When the user vehicle has been driven 100,000 miles or over, the virtual room may be unlocked and the user may access the virtual room.
- the unlocked virtual room may include discounts offered to the user, such as a no down-payment offer on a new vehicle, a low interest rate on financing, or $1,000 off a new vehicle. The user may be able to select one, or all of the discounts offered to the user in the unlocked virtual room.
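The mileage-gated virtual room can be sketched directly from the example above. The threshold and the discount list mirror the text; the function shape itself is an assumption:

```python
# Discounts offered in the unlocked virtual room, per the example above.
DISCOUNTS = ["no down-payment offer on a new vehicle",
             "low interest rate on financing",
             "$1,000 off a new vehicle"]

def virtual_room(miles_driven: int, threshold: int = 100_000) -> dict:
    # Under the threshold the room stays locked and offers nothing.
    unlocked = miles_driven >= threshold
    return {"unlocked": unlocked, "discounts": DISCOUNTS if unlocked else []}

room = virtual_room(100_000)
```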
- the controller 106 may provide, to the user vehicle, a route to a nearby dealership that offers the discounts.
- FIG. 4 depicts a flowchart for a method 400 for displaying the virtual space 102 , according to one or more embodiments shown and described herein.
- the method 400 includes receiving the user information from the user profile in block 402 .
- the user information includes the user location, the user design preference, and the at least one user metric.
- the method 400 further includes determining the display conditions based on the user information in block 404 , determining user accessibility to the virtual features based on the at least one user metric in block 406 , and displaying the virtual space 102 including the virtual features on the graphical user interface 104 based on the display conditions and the user accessibility in block 408 .
- the method 400 may include receiving user input.
- the user input may be received through the user device 111 , such as through the graphical user interface 104 , the hand controllers 115 of the VR device 112 , or the microphone 114 .
- the method 400 may also include connecting to the processor of the user vehicle 134 that may record the plurality of driving metrics.
- the method 400 may also include unlocking the virtual room based on the at least one user metric.
- the method 400 may also include connecting the external user application 122 to the controller 106 and receiving the user application metrics from the external user application 122 .
- the user application metrics may be used to determine the display conditions of the virtual space 102 .
- the locked virtual room may be accessed/unlocked based on the user application metrics.
- the user application metrics may relate to a frequency of use of the external user application 122 . For example, if the external user application 122 is a ride-sharing service, the user application metrics may include a number of rides the user has ordered from the ride-sharing service. If the user has ordered a number of rides over a threshold number of rides, the method 400 may include unlocking the virtual room. If the external user application 122 is a gaming application, the user application metrics may include a gaming level the user has completed; if the user has completed a gaming level over a threshold gaming level, the method 400 may include unlocking the virtual room.
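The application-metric gating above reduces to a per-application threshold check. The threshold values and metric names here are illustrative, not from the patent:

```python
# Hypothetical per-application unlock thresholds: (metric name, minimum).
THRESHOLDS = {
    "ride_share": ("rides_ordered", 50),
    "gaming": ("level_completed", 10),
}

def room_unlocked_by_app(app_type: str, metrics: dict) -> bool:
    # The virtual room unlocks once the relevant metric meets its threshold.
    key, threshold = THRESHOLDS[app_type]
    return metrics.get(key, 0) >= threshold

unlocked = room_unlocked_by_app("ride_share", {"rides_ordered": 62})
```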
- variable being a “function” of a parameter or another variable is not intended to denote that the variable is exclusively a function of the listed parameter or variable. Rather, reference herein to a variable that is a “function” of a listed parameter is intended to be open ended such that the variable may be a function of a single parameter or a plurality of parameters.
- references herein of a component of the present disclosure being “configured” or “programmed” in a particular way, to embody a particular property, or to function in a particular manner, are structural recitations, as opposed to recitations of intended use. More specifically, the references herein to the manner in which a component is “configured” or “programmed” denotes an existing physical condition of the component and, as such, is to be taken as a definite recitation of the structural characteristics of the component.
Abstract
Systems, methods, and non-transitory computer-readable mediums for displaying a virtual space are provided. The system for displaying a virtual space includes a graphical user interface and a controller. The controller is programmed to receive user information from a user profile, wherein the user information comprises a user location, a user design preference, and at least one user metric. The controller is also programmed to determine display conditions based on the user information, determine user accessibility to virtual features based on the at least one user metric, and display the virtual space including the virtual features on the graphical user interface based on the display conditions and the user accessibility. The virtual space is displayed as a virtual 3D rendered image.
Description
- The present disclosure relates to systems, methods, and non-transitory computer-readable mediums for displaying a virtual space.
- Graphical user interfaces display a computer-generated environment to a user. This may be done through a conventional graphical user interface of a user device such as a phone or tablet. The computer-generated environment may also be displayed through a virtual reality (VR) headset, which creates a sense of presence and immersion within the computer-generated environment. Conventional computer-generated environments are generalized and do not take into account the preferences of the user when displaying the computer-generated environment. However, customization of the computer-generated environment plays a critical role in the nature of the experience a user has with the user device.
- Accordingly, a need exists for systems, methods, and non-transitory computer-readable mediums that display a virtual space customized to user preferences.
- The present disclosure provides systems, methods, and non-transitory computer-readable mediums for displaying a virtual space. The systems, methods, and non-transitory computer-readable mediums display a virtual space customized to user preferences by receiving information from a user profile and displaying virtual features based on the information from the user profile. The user information may include a user location, user design preference, or at least one user metric. With a customized virtual space, the user may have a more enjoyable and customized user experience when viewing the virtual space through the graphical user interface.
- In one or more embodiments, a computer implemented system for displaying a virtual space is provided. The system includes a graphical user interface and a controller. The controller is programmed to receive user information from a user profile. The user information includes a user location, a user design preference, and at least one user metric. The controller is also programmed to determine display conditions based on the user information, determine user accessibility to virtual features based on the at least one user metric, and display the virtual space including the virtual features on the graphical user interface based on the display conditions and the user accessibility. The virtual space is displayed as a virtual 3D rendered image.
- In another embodiment, a computer implemented method for displaying a virtual space is provided. The method includes receiving user information from a user profile, wherein the user information comprises a user location, a user design preference, and at least one user metric. The method further includes determining display conditions based on the user information, determining user accessibility to virtual features based on the at least one user metric, and displaying the virtual space including the virtual features on a graphical user interface based on the display conditions and the user accessibility. The virtual space is displayed as a virtual 3D rendered image.
- These and additional features provided by the embodiments of the present disclosure will be more fully understood in view of the following detailed description, in conjunction with the drawings.
- The following detailed description of specific embodiments of the present disclosure can be best understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
- FIG. 1 depicts an exemplary virtual space displayed by a computer implemented system for displaying the virtual space, according to one or more embodiments shown and described herein;
- FIG. 2 illustrates an example of a user viewing a virtual car dealership through a virtual reality device, according to one or more embodiments shown and described herein;
- FIG. 3 depicts a schematic diagram of the system for displaying the virtual space, according to one or more embodiments shown and described herein; and
- FIG. 4 depicts a flowchart for a method for displaying the virtual space, according to one or more embodiments shown and described herein.
- Reference will now be made in greater detail to various embodiments of the present disclosure, some embodiments of which are illustrated in the accompanying drawings. Whenever possible, the same reference numerals will be used throughout the drawings to refer to the same or similar parts.
- The present disclosure provides systems, methods, and non-transitory computer-readable mediums for displaying a virtual space. The systems, methods, and non-transitory computer-readable mediums display a virtual space customized to user preferences by receiving information from a user profile and displaying virtual features based on the information from the user profile. The user information may include a user location, user design preference, or at least one user metric. The systems, methods, and non-transitory computer-readable mediums may further use a virtual reality (VR) device, an external user application, or a vehicle processor to display and customize the virtual space to the user. Customization of the virtual space through the utilization of user preferences generates a more familiar, customized, and enjoyable virtual space for the user. Moreover, connection to a user profile and real-time customization of the virtual space and offering of discounts provides a significant advantage over prior, non-customizable systems.
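The customization flow summarized above (receive the user information, determine display conditions, determine accessibility, then display) can be sketched in outline as follows. This is a minimal, hypothetical sketch: the class and function names are illustrative assumptions, not part of the disclosure, and the 100,000-mile threshold repeats the example given later in the description.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Mirrors the user information described in the disclosure.
    location: str
    design_preference: str
    metrics: dict = field(default_factory=dict)

def determine_display_conditions(profile: UserProfile) -> dict:
    # Display conditions replicate the user's environment and preferred style.
    return {"background": profile.location,
            "architecture": profile.design_preference}

def determine_accessibility(profile: UserProfile) -> bool:
    # Example threshold condition: unlock at 100,000 miles driven.
    return profile.metrics.get("miles_driven", 0) >= 100_000

def display_virtual_space(profile: UserProfile) -> dict:
    # Combine conditions and accessibility into the rendered scene description.
    return {"conditions": determine_display_conditions(profile),
            "virtual_room_unlocked": determine_accessibility(profile)}
```

A call such as `display_virtual_space(UserProfile("Denver", "modern", {"miles_driven": 120_000}))` would report a modern-architecture scene with the virtual room unlocked.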
FIG. 1 depicts an exemplary virtual space 102 displayed by a computer implemented system 100 for displaying the virtual space 102, according to one or more embodiments shown and described herein. - Referring to
FIG. 1, the virtual space 102 displayed by a system 100 may be displayed through a graphical user interface 104. The virtual space 102 may be displayed as a 3D rendered image. A controller 106 (discussed further below) may be programmed to display the virtual space 102 on the graphical user interface 104. The virtual space 102 may be any number of environments the user wishes the graphical user interface 104 to display. The virtual space 102 may include a virtual shopping center, a virtual concert, or any other suitable virtual space 102 that the user wishes to display. In the non-limiting example in FIG. 1, the virtual space 102 is depicted as a virtual car dealership 103. In embodiments, the virtual space 102 may be based on real images. For example, real images of a dealership may be stored in a memory of the present system. The real images may be collected from publicly accessible data, e.g., images from a social networking service, images from official homepages of the dealership, and the like. The controller 106 of the present system may transform the 2D real images into 3D rendered images that may be used for a virtual reality experience. - The
virtual car dealership 103 may render virtual vehicles 107, virtual buildings 108, or virtual users 110 in the virtual space 102. The virtual car dealership 103 may appear similar to a physical car dealership. The virtual car dealership 103 may be designed to replicate a particular physical car dealership. For example, the present system obtains images of actual vehicles displayed on the property of the physical car dealership and renders 3D images of corresponding vehicles based on the obtained images. As another example, the present system obtains the inventory information of the particular physical car dealership and renders 3D virtual vehicles based on the inventory information. The inventory information may include detailed information about the vehicles, such as makes, models, colors, accessories, and the like. The virtual car dealership 103 may also be a generic representation of the physical car dealership. In other embodiments, the virtual car dealership 103 may be a computer generated car dealership that does not exist as a physical car dealership. In some embodiments, one or more icons, texts, or other graphical elements may indicate descriptions of vehicles, prices of vehicles, or other relevant information in the virtual car dealership 103. The type/style of virtual car dealership 103 that is displayed may depend on user information received from a user profile (as described further below). In embodiments, the rendering of the 3D virtual vehicles may be dynamically updated based on the status of physical vehicles. For example, when a customer is test driving a physical vehicle at a dealership, the physical vehicle may transmit, to the controller 106, its current status, such as information that the physical vehicle is currently driving, or the current location of the physical vehicle.
The controller 106 may update the rendering of the virtual vehicle 107 corresponding to the physical vehicle based on the information that the physical vehicle is currently driving or on the current location of the physical vehicle. For example, the controller 106 may update the color of the virtual vehicle 107 to gray to indicate that the virtual vehicle is not available. As another example, the controller 106 may remove the virtual vehicle 107 from the virtual dealership. -
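The status-driven update just described can be sketched as follows. This is a hedged illustration only: the class, field, and function names are assumptions chosen for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VirtualVehicle:
    vin: str
    color: str
    visible: bool = True  # whether the model is rendered in the dealership

def apply_vehicle_status(vehicle: VirtualVehicle, status: dict) -> VirtualVehicle:
    """Update a virtual vehicle from a status report sent by the physical vehicle."""
    if status.get("driving"):
        # Gray out the model to signal that the vehicle is currently unavailable.
        vehicle.color = "gray"
    if status.get("removed"):
        # Remove the model from the virtual dealership entirely.
        vehicle.visible = False
    return vehicle
```

In a fuller system the `status` dictionary would arrive from the vehicle processor over the network rather than being constructed locally.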
FIG. 2 illustrates an example of the user viewing the virtual car dealership 103 through a VR device 112, according to one or more embodiments described and illustrated herein. - The
graphical user interface 104 may be a user interface of a user device 111. The user device 111 may be a phone, a tablet, a computer, or any other device that may display the graphical user interface 104. The user device 111 may include the graphical user interface 104 and the controller 106. In some embodiments, the user device 111 may include the VR device 112. The VR device 112 may include a VR headset 113 that the user wears to view the virtual space 102. In the exemplary embodiment of FIG. 2, the user is viewing the virtual car dealership 103. However, as discussed hereinabove, the user may also view virtual shopping centers, virtual concerts, virtual sports games, or any other suitable virtual space 102 the user wishes to view on the graphical user interface 104 of the user device 111. - The user may interact with the
virtual space 102. Referring again to FIG. 1, the user may interact with the virtual space 102 through the graphical user interface 104 when the user device 111 includes a graphical user interface 104 that is a touch screen. Thus, the user may select certain objects, change the view, or navigate the virtual space 102 through the graphical user interface 104 of the user device 111. In some embodiments, the user device 111 may include a microphone 114, such that the user can speak commands into the user device 111 in order to interact with the virtual space 102. Referring again to FIG. 2, in embodiments in which the user device 111 is the VR device 112, the VR device 112 may include hand controllers 115, allowing the user to interact with the virtual space 102 through the hand controllers 115. -
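The three input paths just described (touch screen, voice command, and VR hand controller) could be normalized into a single interaction handler. The event shapes and the dispatcher name below are assumptions made for illustration.

```python
def handle_user_input(event: dict) -> str:
    """Normalize touch, voice, and controller events into one interaction string."""
    source = event.get("source")
    if source == "touch":
        # Tap on the graphical user interface 104 selects an object.
        return f"select:{event['target']}"
    if source == "voice":
        # Spoken command captured by the microphone 114.
        return f"command:{event['utterance'].lower()}"
    if source == "controller":
        # Gesture from the hand controllers 115 of the VR device 112.
        return f"gesture:{event['gesture']}"
    return "ignored"
```

A dispatcher like this keeps the virtual-space logic independent of which device produced the input.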
FIG. 3 depicts a schematic diagram of the system 100 for displaying the virtual space 102, according to one or more embodiments shown and described herein. - As discussed hereinabove, the
virtual space 102 may be displayed through the controller 106. The controller 106 may be included in a server, e.g., a cloud server, an edge server, and the like. The controller 106 may be programmed to receive user information from a user profile, determine display conditions based on the user information, determine user accessibility to virtual features based on at least one user metric, and display the virtual space 102 on the graphical user interface 104. One or more of the controller 106 functions may be executed in real time. Referring now to FIG. 3, the controller 106 may include a processor 118 and a memory unit 120. The processor 118 can be any device capable of executing machine-readable and executable instructions 128. The processor 118 may be a controller, an integrated circuit, a microchip, a computer, or any other computing device. The processor 118 is coupled to a communication path 132 that provides signal interconnectivity between various modules of the system 100. The communication path 132 may communicatively couple any number of processors 118 with one another, and allow the modules coupled to the communication path 132 to operate in a distributed computing environment. Each of the modules may operate as a node that may send and/or receive data. As used herein, the term “communicatively coupled” means that coupled components are capable of exchanging data signals with one another, such as electrical signals via a conductive medium, electromagnetic signals via air, optical signals via optical waveguides, and the like. - The
communication path 132 may be formed from any medium that is capable of transmitting a signal, such as conductive wires, conductive traces, optical waveguides, or the like. In some embodiments, the communication path 132 may facilitate the transmission of wireless signals, such as WiFi, Bluetooth®, Near Field Communication (NFC), and the like. The communication path 132 may be formed from a combination of mediums capable of transmitting signals. In one embodiment, the communication path 132 comprises a combination of conductive traces, conductive wires, connectors, and buses that cooperate to permit the transmission of electrical data signals to components such as processors, memories, sensors, input devices, output devices, and communication devices. The communication path 132 may comprise a vehicle bus, such as a LIN bus, a CAN bus, a VAN bus, and the like. Additionally, it is noted that the term “signal” means a waveform (e.g., electrical, optical, magnetic, mechanical, or electromagnetic), such as DC, AC, sinusoidal-wave, triangular-wave, square-wave, vibration, and the like, capable of traveling through a medium. - The
memory unit 120 may contain a non-transitory computer-readable medium comprising RAM, ROM, flash memories, hard drives, or any device capable of storing machine-readable and executable instructions 128 such that the machine-readable and executable instructions 128 can be accessed by the processor 118. The machine-readable and executable instructions 128 may comprise logic or algorithm(s) written in any programming language of any generation (e.g., 1GL, 2GL, 3GL, 4GL, or 5GL) such as, for example, machine language that may be directly executed by the processor 118, or assembly language, object-oriented programming (OOP), scripting languages, microcode, etc., that may be compiled or assembled into machine-readable and executable instructions 128 and stored in the memory unit 120. The machine-readable and executable instructions 128 may be written in a hardware description language (HDL), such as logic implemented via either a field-programmable gate array (FPGA) configuration or an application-specific integrated circuit (ASIC), or their equivalents. The methods described herein may be implemented in any conventional computer programming language, as pre-programmed hardware elements, or as a combination of hardware and software components. The processor 118, along with the memory unit 120, may operate as the controller 106 for the system 100. - Still referring to
FIG. 3, the controller 106 may be communicatively coupled to the user device 111 (depicted in FIG. 1), a processor of the user vehicle 134 (discussed further below), a weather reporting system 116 (discussed further below), or each of the foregoing, by a network 130. In one embodiment, the network 130 may include one or more computer networks (e.g., a personal area network, a local area network, or a wide area network), cellular networks, satellite networks, and/or a global positioning system, and combinations thereof. The controller 106 may be communicatively coupled to the network 130 via a wide area network, a local area network, a personal area network, a cellular network, a satellite network, etc. Suitable local area networks may include wired Ethernet and/or wireless technologies such as Wi-Fi. Suitable personal area networks may include wireless technologies such as IrDA, Bluetooth®, Wireless USB, Z-Wave, ZigBee, and/or other near field communication protocols. Suitable cellular networks include, but are not limited to, technologies such as LTE, WiMAX, UMTS, CDMA, and GSM. - As discussed hereinabove, the
controller 106 may receive user information from the user profile. The user information on the user profile may be manually entered by the user through the user device 111. In other embodiments, the user may input a name and address into the user profile, and the user information may be autocompleted based on a user profile database 124 stored on the memory unit 120 that stores user information. The controller 106 may also be communicatively coupled to an external user application 122. - The
external user application 122 may be an application installed on the user device 111. The external user application 122 may include a video-streaming application, a gaming application, a ride-share application, or any other external user application 122 that may be installed on the user device 111. The controller 106 may receive user application metrics from the external user application 122. The user application metrics and the external user application 122 are discussed further below. - The user information may include a user location, a user design preference, current vehicle information, at least one user metric, or any other user information that may be stored on the user profile. The user location may be determined through a global positioning system (GPS) of the
user device 111. The user location may include a country, state, city, or zip code in which the user is located. In some embodiments, the user may set the user location to a location different from the physical location of the user determined through the GPS of the user device 111. For example, the user may set the location as New York City, even though the GPS of the user device 111 has determined that the user location is West Virginia. - The user information may also include user demographic information on the user profile. The user demographic information may include age, gender, marital status, or any other demographic information that may be stored on the user profile. The user demographic information may assist the
controller 106 in determining the display conditions to display on the graphical user interface 104, as discussed further below. - The user design preference may include preferences on how the user prefers the
virtual space 102 to appear. In some embodiments, the user design preference includes a preferred architecture. For example, the user may select the preferred architecture from a plurality of architectures listed on the graphical user interface 104. The plurality of architectures may include modern, colonial, Victorian, or any other architecture type that the user may prefer. The user preferences may also include a brightness of colors (i.e., light or dark colors), a time of day, or any other preference the user may have when viewing the virtual space 102. The virtual space 102 may be designed to include the preferred architecture. For example, if the preferred architecture is modern, the virtual buildings 108 may be of a modern architecture. On the other hand, if the preferred architecture is colonial, the virtual buildings 108 in the virtual space 102 may be of a colonial architecture. - A plurality of user profiles may be included on the
user profile database 124 stored on the memory unit 120. The plurality of user profiles include user information corresponding to the users in each of the user profiles. In some embodiments, the user design preference for a new user may be automatically determined by the controller 106 by comparing the user information of the user profile of the new user with the user information stored on the plurality of user profiles of users that have a similar background to the new user on the user profile database 124. For example, the controller 106 may recognize that user profiles in the plurality of user profiles that are male, 50 years old, and located in Cleveland, Ohio, often select a colonial architecture from the plurality of architectures. Based on this recognition, the controller 106 may automatically determine the user design preference as the colonial architecture type for a user profile with similar user information (i.e., male, 48 years old, located in Columbus, Ohio). - The at least one user metric may be based on metrics of a user vehicle. The user vehicle may be an automobile or any other passenger or non-passenger vehicle such as, for example, a terrestrial, aquatic, and/or airborne vehicle. In some embodiments, the user vehicle may be an autonomous vehicle that navigates its environment with limited human input or without human input. The processor of the
user vehicle 134 may be communicatively coupled to the controller 106 (depicted in FIG. 3). As such, the controller 106 may monitor a plurality of driving metrics recorded by the processor of the user vehicle 134 and classify the plurality of driving metrics as the at least one user metric. For example, the at least one user metric may include a number of miles the user vehicle has been driven. The number of miles the user vehicle has been driven may include a total number of miles that the user vehicle has been driven. The number of miles of the user vehicle may be transmitted from the processor of the user vehicle 134 to the controller 106 in real time. In other embodiments, the number of miles the user vehicle has been driven may be a number of miles the user vehicle has been driven since the user purchased the user vehicle (in scenarios where the user buys a used vehicle) or an average number of miles the user vehicle is driven per day, week, month, or year. The at least one user metric may also include a number of consecutive days, weeks, or months the user vehicle has been driven. For example, the at least one user metric may be that the user vehicle is driven every day, or that the user vehicle is driven only Monday through Friday. - The processor of the
user vehicle 134 connected to the controller 106 may include a GPS of the user vehicle. Therefore, the at least one user metric may include a location history 126 of the user vehicle. The location history 126 of the user vehicle may be stored on the memory unit 120 (as depicted in FIG. 3) and may include routes the user vehicle has driven, a number of counties or states the user vehicle has been driven to, a number of National Parks the user vehicle has been driven in, or any other suitable location history 126 of the user vehicle. - As discussed hereinabove, after receiving the user information from the user profile, the
controller 106 may also be programmed to determine display conditions based on the user information. The display conditions correspond to how the virtual space 102 appears to the user through the graphical user interface 104. The display conditions may directly correspond to the user information from the user profile in order to replicate an environment the user is in. For example, if the user information indicates that the user location is Denver, Colorado, and the user design preference is modern, the display conditions may include a mountainous background, and virtual buildings 108 in the virtual space 102 may be rendered in a modern architectural style. The display conditions may also include a weather condition or a time of day. - The time of day may change the display conditions in the
virtual space 102, such as an amount of sunlight. For example, if the time of day is 9 PM, no sunlight may be displayed in the virtual space 102 and the display conditions may be adjusted to nighttime. The weather condition may also affect the display conditions in the virtual space 102. The weather condition may be retrieved from the weather reporting system 116 communicatively coupled to the controller 106 through the communication path 132 (depicted in FIG. 3). The controller 106 may obtain information from the weather reporting system 116 by querying the weather reporting system 116 with the user location. The controller 106 may then adjust the display conditions in the virtual space 102 based on the weather conditions at the user location. - The
controller 106 may also be programmed to determine user accessibility to virtual features based on the user information or the at least one user metric. The virtual features may include features of the virtual space 102. For example, the virtual features may include a locked virtual room. The locked virtual room may include discounts that a user can select from once the locked virtual room has become unlocked. The virtual features, such as the locked virtual room, may only be unlocked/accessible if the user information or the at least one user metric meets a threshold condition. - In some embodiments, the user accessibility to the virtual features (i.e., the threshold condition) may be based on whether the user vehicle has entered within a defined location. For example, user accessibility to virtual features of a beach background in the
virtual space 102 may be based on whether the user vehicle has travelled within 5 miles of an ocean coastline. In other embodiments, user accessibility to virtual features of the virtual space 102 may be based on whether the user vehicle has entered within a certain number of defined locations. For example, accessibility of virtual features in the virtual space 102 may be based on whether the user vehicle has entered within the defined location of 5 National Parks. Upon the user vehicle entering the defined location of 5 National Parks, a notification that the virtual features are now accessible may be transmitted to the user device 111 or to the processor of the user vehicle 134. The user device 111 or the user vehicle 134 may display the accessible virtual features on a screen of the user device 111 or the user vehicle 134. - In other embodiments, the user accessibility to virtual features may be based on the at least one user metric of the number of miles the user vehicle has been driven. For example, a virtual room in the
virtual space 102 described above may be locked if the user vehicle has been driven under 100,000 miles. When the user vehicle has been driven 100,000 miles or more, the virtual room may be unlocked and the user may access the virtual room. The unlocked virtual room may include discounts offered to the user, such as a no down-payment offer on a new vehicle, a low interest rate on financing, or $1,000 off a new vehicle. The user may be able to select one, or all, of the discounts offered in the unlocked virtual room. When the user vehicle drives over 100,000 miles and the virtual room is unlocked while the user vehicle is driving, the controller 106 may provide the user vehicle with a route to a nearby dealership that offers the discounts to the user. -
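A minimal sketch of the threshold-based unlocking described here, assuming the 100,000-mile example from the disclosure. The function names are assumptions; the second helper illustrates the analogous check against an external-application metric, such as a number of rides ordered or a gaming level completed.

```python
def unlock_by_mileage(miles_driven: int, threshold: int = 100_000) -> list[str]:
    """Return the discounts available once the mileage threshold is met."""
    if miles_driven < threshold:
        return []  # virtual room stays locked
    # Discount examples repeated from the disclosure.
    return ["no down payment", "low-interest financing", "$1,000 off a new vehicle"]

def unlock_by_app_metric(metric_value: int, threshold: int) -> bool:
    """Generic threshold check for an external-application metric."""
    return metric_value >= threshold
```

Keeping the threshold as a parameter lets the same check serve different metrics and different unlock conditions.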
FIG. 4 depicts a flowchart for a method 400 for displaying the virtual space 102, according to one or more embodiments shown and described herein. - Referring now to
FIG. 4, a flowchart for the method 400 for displaying the virtual space 102 is depicted. The method 400 includes receiving the user information from the user profile in block 402. As discussed hereinabove, the user information includes the user location, the user design preference, and the at least one user metric. The method 400 further includes determining the display conditions based on the user information in block 404, determining user accessibility to the virtual features based on the at least one user metric in block 406, and displaying the virtual space 102 including the virtual features on the graphical user interface 104 based on the display conditions and the user accessibility in block 408. - The
method 400 may include receiving user input. The user input may be received through the user device 111, such as through the graphical user interface 104, the hand controllers 115 of the VR device 112, or the microphone 114. The method 400 may also include connecting to the processor of the user vehicle 134 that may record the plurality of driving metrics. The method 400 may also include unlocking the virtual room based on the at least one user metric. - The
method 400 may also include connecting the external user application 122 to the controller 106 and receiving the user application metrics from the external user application 122. The user application metrics may be used to determine the display conditions of the virtual space 102. In some embodiments, the locked virtual room may be accessed/unlocked based on the user application metrics. The user application metrics may relate to a frequency of use of the external user application 122. For example, if the external user application 122 is a ride-sharing service, the user application metrics may include a number of rides the user has ordered from the ride-sharing service. If the user has ordered a number of rides over a threshold number of rides, the method 400 may include unlocking the virtual room. If the external user application 122 is a gaming application, the user application metrics may include a gaming level the user has completed; if the user has completed a gaming level over a threshold gaming level, the method 400 may include unlocking the virtual room. - For the purposes of describing and defining the present disclosure, it is noted that reference herein to a variable being a “function” of a parameter or another variable is not intended to denote that the variable is exclusively a function of the listed parameter or variable. Rather, reference herein to a variable that is a “function” of a listed parameter is intended to be open ended such that the variable may be a function of a single parameter or a plurality of parameters.
- It is noted that recitations herein of a component of the present disclosure being “configured” or “programmed” in a particular way, to embody a particular property, or to function in a particular manner, are structural recitations, as opposed to recitations of intended use. More specifically, the references herein to the manner in which a component is “configured” or “programmed” denotes an existing physical condition of the component and, as such, is to be taken as a definite recitation of the structural characteristics of the component.
- It is noted that terms like “preferably,” “commonly,” and “typically,” when utilized herein, are not utilized to limit the scope of the claimed invention or to imply that certain features are critical, essential, or even important to the structure or function of the claimed invention. Rather, these terms are merely intended to identify particular aspects of an embodiment of the present disclosure or to emphasize alternative or additional features that may or may not be utilized in a particular embodiment of the present disclosure.
- The order of execution or performance of the operations in examples of the disclosure illustrated and described herein is not essential, unless otherwise specified. That is, the operations may be performed in any order, unless otherwise specified, and examples of the disclosure may include additional or fewer operations than those disclosed herein. For example, it is contemplated that executing or performing a particular operation before, contemporaneously with, or after another operation is within the scope of aspects of the disclosure.
- Having described the subject matter of the present disclosure in detail and by reference to specific embodiments thereof, it is noted that the various details disclosed herein should not be taken to imply that these details relate to elements that are essential components of the various embodiments described herein, even in cases where a particular element is illustrated in each of the drawings that accompany the present description. Further, it will be apparent that modifications and variations are possible without departing from the scope of the present disclosure, including, but not limited to, embodiments defined in the appended claims. More specifically, although some aspects of the present disclosure are identified herein as preferred or particularly advantageous, it is contemplated that the present disclosure is not necessarily limited to these aspects.
Claims (20)
1. A computer implemented system for displaying a virtual space, the system comprising:
a graphical user interface; and
a controller programmed to:
receive user information from a user profile, wherein the user information comprises a user location, a user design preference, and at least one user metric;
determine display conditions based on the user information;
determine user accessibility to virtual features based on the at least one user metric; and
display the virtual space including the virtual features on the graphical user interface based on the display conditions and the user accessibility, wherein the virtual space is displayed as a virtual 3D rendered image.
2. The computer implemented system of claim 1, wherein the at least one user metric is a number of miles a user vehicle has been driven.
3. The computer implemented system of claim 1, wherein the at least one user metric includes a number of consecutive days a user vehicle has been driven.
4. The computer implemented system of claim 1, wherein the at least one user metric includes a location history of a user vehicle.
5. The computer implemented system of claim 4, wherein the user accessibility to the virtual features is based on whether the user vehicle has entered within a defined location.
6. The computer implemented system of claim 1, further comprising an external user application communicatively coupled to the controller.
7. The computer implemented system of claim 1, further comprising a user device, the user device comprising:
the graphical user interface; and
the controller.
8. The computer implemented system of claim 7, wherein the user device is a virtual reality device.
9. The computer implemented system of claim 1, wherein the virtual features comprise a locked virtual room.
10. The computer implemented system of claim 9, wherein the locked virtual room comprises discounts a user can select from.
11. The computer implemented system of claim 1, wherein the user design preference is a preferred architecture.
12. The computer implemented system of claim 11, wherein the virtual space is designed to comprise the preferred architecture.
13. The computer implemented system of claim 1, wherein the display conditions include a weather condition or a time of day.
14. A computer implemented method for displaying a virtual space, the method comprising:
receiving user information from a user profile, wherein the user information comprises a user location, a user design preference, and at least one user metric;
determining display conditions based on the user information;
determining user accessibility to virtual features based on the at least one user metric; and
displaying the virtual space including the virtual features on a graphical user interface based on the display conditions and the user accessibility, wherein the virtual space is displayed as a virtual 3D rendered image.
15. The computer implemented method of claim 14, further comprising unlocking a virtual room based on the at least one user metric.
16. The computer implemented method of claim 14, further comprising receiving a user input.
17. The computer implemented method of claim 16, wherein the user input comprises the user design preference.
18. The computer implemented method of claim 16, wherein the user input is received through a user device, wherein the user device is a virtual reality device.
19. The computer implemented method of claim 14, further comprising:
connecting to an external user application; and
receiving user application metrics from the external user application.
20. The computer implemented method of claim 14, further comprising connecting to a processor of a user vehicle, wherein the processor of the user vehicle records a plurality of driving metrics.
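The claimed method (claim 14) can be sketched as a short pipeline: receive user information, determine display conditions and feature accessibility, then produce the scene to display. This is an illustrative sketch only; every name, the dictionary-based scene representation, and the mileage threshold are assumptions introduced here, and a real system would render a virtual 3D image rather than return a dictionary.

```python
from dataclasses import dataclass, field

@dataclass
class UserInformation:
    """Hypothetical user-profile payload: location, design preference, metrics (claim 14)."""
    location: str
    design_preference: str                        # e.g. a preferred architecture (claim 11)
    metrics: dict = field(default_factory=dict)   # e.g. {"miles_driven": 12000} (claim 2)

def determine_display_conditions(info: UserInformation) -> dict:
    # Display conditions such as weather or time of day (claim 13) would be
    # looked up from the user location; a fixed placeholder stands in here.
    return {"location": info.location, "time_of_day": "day"}

def determine_accessibility(info: UserInformation, mile_threshold: int = 10000) -> dict:
    # Accessibility to virtual features is based on at least one user metric;
    # the mileage threshold value is an assumed example.
    return {"locked_room": info.metrics.get("miles_driven", 0) > mile_threshold}

def display_virtual_space(info: UserInformation) -> dict:
    # Combine display conditions and accessibility into the scene state that
    # would drive the 3D rendering on the graphical user interface.
    return {
        "conditions": determine_display_conditions(info),
        "accessible_features": determine_accessibility(info),
        "style": info.design_preference,
    }
```

As a usage example, a profile with 12,000 recorded miles would yield a scene whose locked virtual room is accessible, styled per the user's design preference.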
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/186,553 US20240320908A1 (en) | 2023-03-20 | 2023-03-20 | Systems, methods, and non-transitory computer-readable mediums for displaying a virtual space |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240320908A1 true US20240320908A1 (en) | 2024-09-26 |
Family
ID=92802840
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/186,553 Pending US20240320908A1 (en) | 2023-03-20 | 2023-03-20 | Systems, methods, and non-transitory computer-readable mediums for displaying a virtual space |
Country Status (1)
Country | Link |
---|---|
US (1) | US20240320908A1 (en) |
Citations (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050277455A1 (en) * | 2004-06-10 | 2005-12-15 | Microsoft Corporation | Racing games and other games having garage, showroom, and test drive features |
US20070268299A1 (en) * | 2005-02-04 | 2007-11-22 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Attribute enhancement in virtual world environments |
US20080104018A1 (en) * | 2006-10-25 | 2008-05-01 | Bing Xia | Personalized Virtual Reality Home Screen for Mobile Devices |
US20080221776A1 (en) * | 2006-10-02 | 2008-09-11 | Mcclellan Scott | System and Method for Reconfiguring an Electronic Control Unit of a Motor Vehicle to Optimize Fuel Economy |
US20090170614A1 (en) * | 2007-12-26 | 2009-07-02 | Herrmann Mark E | System and method for collecting and using player information |
US20090259539A1 (en) * | 2008-04-15 | 2009-10-15 | International Business Machines Corporation | Proximity-based broadcast virtual universe system |
US20100169798A1 (en) * | 2008-12-29 | 2010-07-01 | Nortel Networks Limited | Visual Indication of User Interests in a Computer-Generated Virtual Environment |
US20110010998A1 (en) * | 2009-07-15 | 2011-01-20 | Adrian Elliott | Simultaneous movement system for a vehicle door ii |
US20110072367A1 (en) * | 2009-09-24 | 2011-03-24 | etape Partners, LLC | Three dimensional digitally rendered environments |
US20120124471A1 (en) * | 2010-09-23 | 2012-05-17 | Gusky Jeffrey S | Virtual tour, display and commerce |
US20140129080A1 (en) * | 2012-02-28 | 2014-05-08 | Recharge Solutions Int'l | System and method for recording driving patterns and suggesting purchasable vehicles |
US20140344014A1 (en) * | 2013-03-15 | 2014-11-20 | Joseph Peter MacInnis | Vehicle dealer management system apparatus and related methods |
US20150193726A1 (en) * | 2013-01-06 | 2015-07-09 | Dei Headquarters, Inc. | Vehicle inventory and customer relation mangement system and method |
US20150379775A1 (en) * | 2014-06-26 | 2015-12-31 | Audi Ag | Method for operating a display device and system with a display device |
US9412203B1 (en) * | 2013-01-22 | 2016-08-09 | Carvana, LLC | Systems and methods for generating virtual item displays |
US9697503B1 (en) * | 2011-04-22 | 2017-07-04 | Angel A. Penilla | Methods and systems for providing recommendations to vehicle users to handle alerts associated with the vehicle and a bidding market place for handling alerts/service of the vehicle |
US20180067641A1 (en) * | 2016-09-01 | 2018-03-08 | PIQPIQ, Inc. | Social networking application for real-time selection and sorting of photo and video content |
US20190012603A1 (en) * | 2017-07-05 | 2019-01-10 | Ford Global Technologies, Llc | Method and apparatus for behavior-based vehicle purchase recommendations |
US20190073703A1 (en) * | 2017-02-23 | 2019-03-07 | Sandeep Aggarwal | Methods and systems of an online vehicle sale website |
US20190369742A1 (en) * | 2018-05-31 | 2019-12-05 | Clipo, Inc. | System and method for simulating an interactive immersive reality on an electronic device |
US20190384865A1 (en) * | 2018-06-14 | 2019-12-19 | International Business Machines Corporation | Intelligent design structure selection in an internet of things (iot) computing environment |
US20200134652A1 (en) * | 2018-10-26 | 2020-04-30 | International Business Machines Corporation | User interface adjustments based on internet-of-things engagement |
US20210065275A1 (en) * | 2019-08-28 | 2021-03-04 | Toyota Motor North America, Inc. | Systems and methods for making vehicle purchase recommendations based on a user preference profile |
US20210312552A1 (en) * | 2016-09-15 | 2021-10-07 | Simpsx Technologies Llc | Virtual Reality, Augmented Reality, Mixed Reality Data Exchange Social Network with Multi Dimensional Map Tile Porting |
US11218522B1 (en) * | 2020-08-28 | 2022-01-04 | Tmrw Foundation Ip S. À R.L. | Data processing system and method using hybrid system architecture for image processing tasks |
US11423619B2 (en) * | 2020-03-25 | 2022-08-23 | Volvo Car Corporation | System and method for a virtual showroom |
US20220331698A1 (en) * | 2015-01-28 | 2022-10-20 | Mark Krietzman | Leveraging online game goals and economy to drive off-line behaviors |
US20220392312A1 (en) * | 2021-06-08 | 2022-12-08 | Aristocrat Technologies, Inc. | Network-based gameplay interaction system |
US11798244B1 (en) * | 2022-12-07 | 2023-10-24 | Nant Holdings Ip, Llc | Location-based digital token management systems, methods, and apparatus |
US20240095418A1 (en) * | 2022-09-15 | 2024-03-21 | GM Global Technology Operations LLC | System and method for an augmented-virtual reality driving simulator using a vehicle |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180066959A1 (en) | Wearable sensor data to improve map and navigation data | |
AU2020239623B2 (en) | Systems and methods for determining an estimated time of arrival | |
US11037263B2 (en) | Systems and methods for displaying an identity relating to a service request | |
DE112016002580B4 (en) | MOBILE GEOGRAPHIC APPLICATION IN AUTOMOTIVE ENVIRONMENT | |
US9057624B2 (en) | System and method for vehicle navigation with multiple abstraction layers | |
JP7541051B2 (en) | Methods for improving traffic visualization | |
JP2020098650A (en) | System and method for displaying vehicle information for on-demand service | |
JP7047096B2 (en) | Systems and methods for determining estimated arrival times for online-to-offline services | |
TW201800287A (en) | Data push method, device and device | |
US20200318986A1 (en) | In-vehicle apparatus and information presentation method | |
WO2020238691A1 (en) | Image processing method and apparatus, and electronic device and storage medium | |
CN110782652A (en) | Speed prediction system and method | |
US11668580B2 (en) | System and method of creating custom dynamic neighborhoods for individual drivers | |
CN111465936A (en) | System and method for determining new roads on a map | |
US10956941B2 (en) | Dynamic billboard advertisement for vehicular traffic | |
CN110321854B (en) | Method and apparatus for detecting target object | |
CN109716715B (en) | System and method for feed stream transmission | |
US20240320908A1 (en) | Systems, methods, and non-transitory computer-readable mediums for displaying a virtual space | |
US20200326201A1 (en) | In-vehicle apparatus, information provision system, and information presentation method | |
KR20220155197A (en) | Method, information processing device, and non-transitory storage medium storing program | |
CN112050822B (en) | Method, system and device for generating driving route | |
CN111143486A (en) | Service position acquisition method and device, electronic equipment and storage medium | |
WO2022203079A1 (en) | Control method, program, information processing device, and information provision method | |
CN111695954A (en) | Vehicle screening method and system of automobile leasing platform and electronic equipment | |
US11776064B2 (en) | Driver classification systems and methods for obtaining an insurance rate for a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA CONNECTED NORTH AMERICA, INC., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZAHID, IMAD;DENTHUMDAS, SHRAVANTHI;MCCLUNG, MARK ANTHONY;REEL/FRAME:063036/0232 Effective date: 20230314 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |