US8806336B2 - Facilitating display of a menu and selection of a menu item via a touch screen interface - Google Patents
Facilitating display of a menu and selection of a menu item via a touch screen interface
- Publication number
- US8806336B2 (application US12/821,399)
- Authority
- US
- United States
- Prior art keywords
- operating tool
- pointing direction
- menu
- display panel
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to an information processing apparatus and an information processing method.
- There is known an information processing apparatus that detects movement of an operating tool, such as a finger of a user, on a display panel and interacts with the user. For example, when the user selects a desired GUI (Graphical User Interface) object on the display panel, the information processing apparatus displays an operation menu containing one or more operation items selectable for the object and asks the user to select a desired operation item. Then, when the operating tool stays in touch with a display area of the object for a predetermined time period, the information processing apparatus recognizes input of the menu starting operation and displays the operation menu.
- GUI: Graphical User Interface
- In such an apparatus, the user has to keep the operating tool in touch with the panel for a predetermined time period until the operation menu is displayed.
- If the predetermined period is shortened, it becomes difficult to discriminate between the general object selecting operation (tapping) and the menu starting operation.
- As a result, the user needs to perform a complicated operation to select the desired operation item, and it cannot necessarily be said that the user enjoys a favorable operation environment.
- Moreover, the user needs to perform a great number of operations, including selecting an option menu after selecting the object.
- Likewise, in displaying the operation menu after selection of a plurality of objects and selecting a desired operation item, the user needs to perform a complicated operation.
- an information processing apparatus including an operating tool detector for detecting a touch state of an operating tool with a display panel, a display controller for, when change of a pointing direction of the operating tool is detected by the operating tool detector on an object selected on the display panel, controlling the display panel to display near the object an operation menu containing one or more operation items selectable for the object, and an operation item selecting portion for, when the operation menu is displayed, selecting one of the operation items in accordance with the change in the pointing direction of the operating tool detected by the operating tool detector from the operation menu.
- the operation item selecting portion may select the operation item on an extension of the pointing direction of the operating tool from the operation menu.
- the operation item selecting portion may select the operation item placed in a direction defined by a change amount obtained by multiplying a change amount of the pointing direction of the operating tool by a coefficient a, the coefficient a being larger than 1, from the operation menu.
- the display controller may control the display panel to rotate the operation menu by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool by a coefficient (1−a) and then display the operation menu.
- the operation item selected by the operation item selecting portion may be executed.
- the display controller may control the display panel to stop display of the operation menu.
- an information processing method including the steps of when change in a pointing direction of an operating tool is detected on an object selected on a display panel, controlling the display panel to display, near the object, an operation menu containing one or more operation items selectable for the object, and when the operation menu is displayed, selecting one of the operation items in accordance with the change of the pointing direction of the operating tool from the operation menu.
- an information processing apparatus and an information processing method capable of facilitating display of an operation menu for an object and selection of an operation item.
- FIG. 1 is a view illustrating an overview of an information processing apparatus according to an embodiment of the present invention
- FIG. 2 is a block diagram illustrating a principal functional structure of the information processing apparatus according to the embodiment of the present invention
- FIG. 3A is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel
- FIG. 3B is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel
- FIG. 3C is a view illustrating a detection result of an operating tool and a position of the operating tool on a display panel
- FIG. 4 is a flowchart illustrating an information processing method according to an embodiment of the present invention.
- FIG. 5 is a view illustrating a processing example by the information processing method (display of an operation menu);
- FIG. 6 is a view illustrating a processing example by the information processing method (execution of an operation item);
- FIG. 7 is a view illustrating a processing example by the information processing method (stop of display of the operation menu);
- FIG. 8 is a view illustrating a processing example by the information processing method (selection of an operation item).
- FIG. 9 is a view illustrating a first modification for selection of operation item
- FIG. 10 is a view illustrating a second modification for selection of operation item.
- FIG. 11 is a view illustrating another display example of the operation menu.
- FIG. 1 is a view illustrating an overview of an information processing apparatus 100 according to an embodiment of the present invention.
- the information processing apparatus 100 detects a touch state of an operating tool M such as a user finger with a display panel 101 .
- the information processing apparatus 100 is a personal computer, a PDA, a portable music player or the like.
- the information processing apparatus 100 has a built-in type display panel 101 , however, the information processing apparatus 100 may be connected to a display panel 101 via communication means.
- the information processing apparatus 100 controls the display panel 101 in such a manner that an operation menu OM containing one or more operation items I selectable for the object O is displayed near the object O. Then, while the operation menu OM is displayed, the information processing apparatus 100 selects an operation item I in accordance with change in the pointing direction of the operating tool M from the operation menu OM.
- the pointing direction of the operating tool M is changed on the object O and the operation menu OM is displayed.
- the operation item I (for example, operation item I6) is selected.
- the pointing direction of the operating tool is a direction pointed out by a finger, for example, when the operating tool is the finger.
- the operation item I and object O selected are illustrated hatched.
- a user can input a menu starting operation by changing the pointing direction of the operating tool M and the user does not need to keep the touch state of the operating tool M for a predetermined time period.
- the user can select a desired operation item I by changing the pointing direction of the operating tool M, and the user does not need to perform complicated operation in selecting of the operation item I.
- the user can perform operations of selecting an object O, displaying an operation menu OM and selecting an operation item I as a series of the operations efficiently.
- FIG. 2 is a block diagram illustrating a principal functional structure of the information processing apparatus 100 according to the embodiment of the present invention.
- the information processing apparatus 100 has the display panel 101 , an operating tool detector 107 , a storage 109 , a display controller 111 and a controller 113 .
- the display panel 101 functions as a touch sensor 103 and a display unit 105 .
- the touch sensor 103 detects a touch state of the operating tool M.
- the touch sensor 103 is an optical sensor, an electric capacitance sensor, a pressure sensor or any other sensor. In the following description, it is assumed that the touch sensor 103 detects the touch state of the operating tool M based on a light-receiving state of the display panel 101 .
- the display unit 105 displays processing results of applications, contents and an object O under control of the display controller 111 and particularly displays an operation menu OM containing one or more operation items I selectable for the object O selected on the display panel 101 .
- the object O is a GUI element such as an icon, a button or a thumbnail.
- the operating tool detector 107 detects the touch state of the operating tool M with the display panel 101 by the touch sensor 103 .
- the operating tool detector 107 uses the light-receiving state of the display panel 101 as a basis to detect presence or absence of touch of the operating tool M with the display panel 101 , a touch position, a touch area and a pointing direction.
- the method of detecting the operating tool M by the touch sensor 103 will be described later.
- the storage 109 stores information processing programs, application program, object O data and the like and particularly stores data of the operation menu OM.
- the controller 113 controls the overall operation of the information processing apparatus 100 by controlling each portion by execution of an information processing program.
- the controller 113 has a function as an operation item selecting portion to select an operation item I from the operation menu OM in accordance with change in the pointing direction of the operating tool M detected by the operating tool detector 107 while the operation menu OM is displayed.
- change in the pointing direction of the operating tool M can be discriminated from conventional operations such as button down, button up, click, double click, touch, drag, drop and flick, and is detected without interfering with these operations.
- RGB pixels and light-receiving sensors are arranged in a matrix.
- the light-receiving sensors function as the touch sensor 103 to receive light emitted from the display panel 101 and reflected by the operating tool M and detect the touch state of the operating tool M based on the light-receiving state.
- the operating tool detector 107 performs digital processing on an output result of the touch sensor 103 thereby to generate a sensor image S.
- the operating tool detector 107 calculates a luminance value expressing the light-receiving state corresponding to each pixel based on the sensor image S, and processes the luminance value into a binary value with use of a predetermined threshold. In the binary processing, the luminance value of each pixel is classified into first or second category, and each area of the sensor image S is classified into first or second area A 1 or A 2 corresponding to respective categories.
- the first and second areas A 1 and A 2 correspond to the large and small luminance areas, which are specified as a touch area and a non-touch area of the operating tool M, respectively.
- the operating tool detector 107 uses existence of the first area A 1 as a basis to detect presence or absence of touch of the operating tool M with the display panel 101 . Besides, the operating tool detector 107 calculates the center-of-gravity position and area of the first area A 1 thereby to detect each of the touch position and touch area of the operating tool M.
- the operating tool detector 107 specifies a long axis direction D of the first area A 1 thereby to detect the pointing direction of the operating tool M.
- the pointing direction of the operating tool M is defined as a direction of pointing out an upper part of the display panel 101 along the long axis direction D of the first area A 1 .
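The long-axis estimation described above can be sketched with image moments of the binarized touch area. This is an illustrative implementation only; the patent does not specify the fitting method, and the panel geometry (y axis, "upper part") is an assumption:

```python
import math

def pointing_direction(mask):
    """Estimate the pointing direction of a touch area.

    `mask` is a 2D list of 0/1 values (the binarized sensor image,
    first area A1 = 1). Returns (cx, cy, angle): the center of gravity
    and the long-axis orientation in degrees, computed from central
    second moments. The axis alone does not say which end points
    "up"; the patent resolves this by taking the direction toward
    the upper part of the display panel.
    """
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None  # no touch detected (absence of first area A1)
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    # Central second moments of the touch area.
    mxx = sum((x - cx) ** 2 for x, _ in pts) / n
    myy = sum((y - cy) ** 2 for _, y in pts) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pts) / n
    # Long-axis orientation of the equivalent ellipse; for a nearly
    # circular area (mxx ≈ myy, mxy ≈ 0) this is unreliable, which
    # corresponds to the detection-error case in FIG. 3C.
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return cx, cy, math.degrees(angle)
```

For a horizontal stripe of touched pixels the estimated axis is 0°, and for a vertical stripe it is 90°, matching the elliptic areas of FIGS. 3A and 3B.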
- the controller 113 calculates an angle difference between pointing directions of the operating tool M before and after rotation thereby to calculate the rotational angle of the operating tool M.
- FIGS. 3A to 3C are views illustrating detection results of the operating tool M and positions of the operating tool M on the display panel 101 .
- the touch area A 1 of a finger end as the operating tool M is grasped as an elliptic area A 1 on a sensor image S.
- the operating tool detector 107 specifies the long axis direction D of the elliptic area A 1 and detects as the pointing direction of the operating tool M a direction of pointing the upper part of the display panel 101 along the specified long axis direction D.
- the touch area A 1 of the finger end with the display panel 101 is grasped as an elliptic area A 1 in which the pointing direction of the finger is the long axis direction D.
- the finger end is rotated from the state of FIG. 3A and a touch area A 1 ′ of the rotated finger end is grasped as an elliptic area A 1 ′ on the sensor image S.
- the operating tool detector 107 specifies the long axis direction D of the elliptic area A 1 ′ and detects the direction of pointing the upper part of the display panel 101 along the specified long axis direction D as a pointing direction of the operating tool M after rotation. Then, the controller 113 uses an angle difference between pointing directions of the operating tool M before and after rotation as a basis to calculate the rotational angle of the operating tool M.
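The rotational angle is just the signed difference between the pointing directions before and after rotation; a minimal sketch (the wrap-around handling is an assumption, not stated in the patent):

```python
def rotation_angle(before_deg, after_deg):
    """Signed angle from the pointing direction before rotation to the
    one after it, wrapped into (-180, 180] so that crossing the 0/360
    boundary does not produce a spurious large rotation."""
    d = (after_deg - before_deg) % 360.0
    return d - 360.0 if d > 180.0 else d
```

A finger turning from 350° to 10° then yields +20°, not −340°.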
- a touch area A 1 ′′ of the finger end is grasped as an approximately circular area A 1 ′′ on the sensor image S.
- the operating tool detector 107 cannot specify the long axis direction D of the touch area A1″, and the controller 113 regards this as a detection error.
- FIG. 4 is a flowchart illustrating an information processing method according to an embodiment of the present invention.
- FIGS. 5 to 8 are views illustrating processing examples of the information processing method.
- the operating tool detector 107 detects a touch state of the operating tool M for each detection frame (S 101 ).
- the controller 113 determines whether or not the touch state of the operating tool M is changed from that in the last detected frame (S 103 ). When the determination result is positive, the controller 113 performs the processing of step S 105 and later, while when the determination result is negative, it goes back to the processing of step S 101 .
- step S 105 the controller 113 determines whether or not the operation menu OM is displayed. When the determination result is positive, the controller 113 performs the processing of step S 107 and later. When the determination result is negative, it performs the processing of step S 115 .
- step S 107 the controller 113 determines whether or not the object O for display of the operation menu is selected on the display panel 101 .
- the object O is selected on the display panel 101 by tapping of the operating tool M or the like.
- the controller 113 determines whether the operating tool M has not moved a predetermined distance or more on the selected object O and whether the operating tool M has rotated a predetermined angle or more (S109, S111).
- a moving distance of the operating tool M is a change amount of the touch position of the operating tool M that has moved in touch with the display panel 101 .
- the rotational amount of the operating tool M means a change amount of the pointing direction of the operating tool M.
- movement for a predetermined distance or more means, for example, movement to the outside of the display area of the selected object O.
- Rotation by a predetermined angle or more means, for example, rotation by such a rotational angle that input of the menu starting operation is not misidentified.
- when both determination results are positive, the controller 113 displays the operation menu OM (S113) and goes back to the processing of step S101.
- otherwise, the controller 113 goes back to the processing of step S101.
- the operation menu OM contains one or more operation items I selectable for the selected object O, which are displayed near the object O.
- a selected operation item I is brought into focus and, for example, the operation item I is displayed enlarged.
- the operation menu OM is displayed in consideration of the position of the operating tool M estimated from the pointing direction of the operating tool M so as to prevent the displayed operation item I from being covered with the operating tool M to reduce the visibility.
- labels of music albums 1 to 7 are displayed on the display panel 101 as objects O and the label of album 3 is selected by the operating tool M.
- the operation menu OM containing the operation items I for selecting from songs 1 to 7 stored in the album 3 is displayed.
- the operation item I 4 on an extension of the pointing direction of the operating tool M is selectable on the operation menu OM.
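Mapping the pointing direction onto an item of the sector-shaped menu can be sketched as follows. The sector layout (start angle, angular span) is an assumed geometry, not specified by the patent:

```python
def item_on_extension(pointing_deg, menu_start_deg, n_items, span_deg=180.0):
    """Return the index of the operation item whose sector lies on the
    extension of the pointing direction, or None if the direction
    falls outside the menu's angular span. Items are assumed to be
    laid out in equal sectors starting at `menu_start_deg`."""
    rel = (pointing_deg - menu_start_deg) % 360.0
    if rel >= span_deg:
        return None  # pointing past the menu's arc
    sector = span_deg / n_items
    return int(rel // sector)
```

With seven items spread over 180°, a finger pointing at 90° from the menu's start selects the middle item, much as I4 is selected in FIG. 5.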
- when the operation menu OM is determined in step S105 to be displayed, the controller 113 determines in step S115 whether or not the operating tool M has changed into the non-touch state. Then, when the determination result is positive, the controller 113 executes the operation item I selected on the operation menu OM (S117) and goes back to step S101.
- when the determination result is negative, the processing of step S119 is performed.
- the operation item I selected on the operation menu OM is executed by changing the operating tool M into the non-touch state while the operation menu OM is displayed.
- replay of the operation item I 4 selected on the operation menu OM is started. Then, the user can easily instruct execution of the operation item I by bringing the operating tool M into the non-touch state.
- in step S119, the controller 113 determines whether or not the operating tool M has moved a predetermined distance or more.
- movement of a predetermined distance or more means, for example, movement to the outside of the display area of the operation menu OM.
- when the determination result is positive, display of the operation menu OM is stopped (S121), and the processing goes back to step S101.
- when the determination result is negative, the processing of step S123 is performed.
- display of the operation menu OM is stopped by the operating tool M that has moved a predetermined distance or more while the operation menu OM is displayed.
- display of the operation menu OM is stopped.
- in step S123, the controller 113 determines whether or not the operating tool M has rotated a predetermined angle or more.
- rotation by a predetermined angle or more means, for example, rotation of the pointing direction of the operating tool M with a detection accuracy or more.
- when the determination result is positive, selection of the operation item I is performed (S125) and display of the operation menu OM and the like are updated. Then, the controller 113 returns to the processing of step S101.
- the operation item I is selected in accordance with change in the pointing direction of the operating tool M while the operation menu OM is displayed. Then, on the operation menu OM, the focus is moved to the selected operation item I.
- the operation item I placed on the extension of the pointing direction of the operating tool M on the display panel 101 is selected. Then, the user can easily select the desired operation item I in accordance with the change in the pointing direction of the operating tool M.
- predetermined operation items I may be set to be selectable or all of them may be set to be unselectable.
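The per-frame decision flow of FIG. 4 (steps S101 to S125) can be sketched as a small state machine. The threshold values and the exact shape of the per-frame inputs are illustrative assumptions:

```python
class MenuController:
    """Sketch of the decision flow of FIG. 4. Each call to on_frame
    corresponds to one detection frame (S101/S103); thresholds are
    assumed values, not taken from the patent."""
    MOVE_LIMIT = 30.0   # px; "predetermined distance" (assumed)
    ROT_LIMIT = 15.0    # deg; "predetermined angle" (assumed)

    def __init__(self):
        self.menu_shown = False

    def on_frame(self, touching, object_selected, moved, rotated):
        # S105: branch on whether the operation menu is displayed.
        if not self.menu_shown:
            # S107-S113: show the menu when the tool rotates on a
            # selected object without moving a predetermined distance.
            if (object_selected and touching
                    and moved < self.MOVE_LIMIT
                    and abs(rotated) >= self.ROT_LIMIT):
                self.menu_shown = True
                return "show_menu"
            return None
        # Menu is displayed.
        if not touching:                      # S115-S117
            self.menu_shown = False
            return "execute_selected_item"
        if moved >= self.MOVE_LIMIT:          # S119-S121
            self.menu_shown = False
            return "hide_menu"
        if abs(rotated) >= self.ROT_LIMIT:    # S123-S125
            return "update_selection"
        return None
```

Rotating on a selected object opens the menu, further rotation moves the focus, lifting the finger executes the focused item, and dragging away dismisses the menu.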
- FIGS. 9 and 10 are views illustrating first and second modifications for selecting an operation item I.
- FIG. 11 is a view illustrating another display example of the operation menu OM.
- the pointing direction of the operating tool M is rotated 45° clockwise while the operation item I 4 on the extension of the pointing direction of the operating tool M is selected.
- thus the user can select a desired operation item I more easily than when the operation item I on the extension of the pointing direction of the operating tool M is selected.
- the operation item I is selected that is placed in the direction defined by a change amount obtained by multiplying a change amount of the pointing direction of the operating tool M by a coefficient a.
- because the change in the pointing direction of the operating tool M is weighted heavily, the operability in selection is improved as compared with selecting the operation item I on the extension of the pointing direction.
- however, since the position of the selected operation item I does not match the pointing direction of the operating tool M (for example, in FIG. 9, not the operation item I6 but the operation item I5 is positioned on the extension of the pointing direction of the operating tool M), it is difficult to select the operation item I by an intuitive operation.
- an operation item I is selected that is placed in the direction defined by a change amount obtained by multiplying a change amount of the pointing direction of the operating tool M by a coefficient a (1&lt;a), and the operation menu OM is rotated by a change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by a coefficient (1−a).
- the pointing direction of the operating tool M is rotated 45° clockwise while the operation item I 4 is selected on the extension of the pointing direction of the operating tool M.
- the operation item I 6 on the extension of the pointing direction of the operating tool M is selected.
- the user can easily select the desired operation item I by the intuitive operation as compared with selecting of the operation item I placed in the direction defined by the change amount obtained by multiplying the change amount of the pointing direction of the operating tool M by the coefficient a.
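The two modifications reduce to two small formulas, and the second one restores the intuitive match between finger and focus. A sketch, with the coefficient value a = 2 chosen only for illustration:

```python
def selected_angle(base_deg, delta_deg, a=2.0):
    """First modification (FIG. 9): the item lying in the direction
    base + a*delta (a > 1) is selected, so a small finger rotation
    sweeps a wide arc of the menu."""
    return base_deg + a * delta_deg

def menu_rotation(delta_deg, a=2.0):
    """Second modification (FIG. 10): the menu itself is rotated by
    (1 - a)*delta. The selected item's on-screen angle becomes
    base + a*delta + (1 - a)*delta = base + delta, i.e. it lands back
    on the extension of the new pointing direction."""
    return (1.0 - a) * delta_deg
```

With a = 2 and a 45° clockwise finger rotation from base 0°, the item at 90° is selected while the menu rotates −45°, so the selected item ends up at 45°, exactly on the finger's extension.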
- FIG. 11 illustrates a display example of the operation menu OM containing one or more operation items I selectable for a plurality of objects O.
- in FIG. 11, for example, statistical processing such as maximum, minimum, average and sum is performed on data contained in a plurality of cells that form a spreadsheet (object O).
- the user performs dragging of the operating tool M on the display panel 101 to select a plurality of cells containing data for statistical process, and then, rotates the operating tool M on the cell at the dragging end by a predetermined angle or more.
- the controller 113 recognizes input of the menu starting operation and displays an approximately sector-shaped operation menu OM around the cell at the end. Then, following the menu starting operation, the user can select the operation item I in accordance with change in the pointing direction of the operating tool M (for example, in FIG. 11 , the operation item I 3 is selected).
- the controller 113 does not recognize input of the menu starting operation as long as the change in the pointing direction is less than a predetermined angle.
- the information processing apparatus 100 controls the display panel 101 (display unit 105) to display the operation menu OM containing one or more operation items I selectable for the object O near the object O. Then, the information processing apparatus 100 selects an operation item I on the operation menu OM in accordance with the change in the pointing direction of the operating tool M while the operation menu OM is displayed.
- the user can input a menu starting operation by changing the pointing direction of the operating tool M, and does not need to keep the touch state of the operating tool M for a predetermined time period.
- the user can select a desired operation item I by changing the pointing direction of the operating tool M and does not need to perform a complicated operation in selecting of the operation item I.
- the user can perform operations of selecting of the object O, displaying of the operation menu OM and selecting of the operation item I as a series of the operations efficiently.
- the sensor may be an electrical capacitance sensor, a pressure sensor or any other touch sensor.
- the pointing direction of the operating tool M is detected based on the touch state of the operating tool M.
- the pointing direction of the operating tool M may be detected from the touch state and proximity state of the operating tool M.
- the sensor image as an output result of the touch/proximity sensor is processed into ternary values to specify the touch area, proximity area and non-touch area of the operating tool M.
- the center-of-gravity positions of the proximity area and the touch area are used as a basis to detect, as the pointing direction of the operating tool M, the direction from the center of gravity of the proximity area toward the center of gravity of the touch area.
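The centroid-to-centroid direction just described is a single `atan2`. A minimal sketch (screen-coordinate conventions, e.g. whether y grows downward, are left as an assumption):

```python
import math

def pointing_from_centroids(prox, touch):
    """Pointing direction as the direction from the proximity-area
    centroid (the hovering finger body) toward the touch-area
    centroid (the fingertip). `prox` and `touch` are (x, y) tuples;
    returns the angle in degrees."""
    return math.degrees(math.atan2(touch[1] - prox[1], touch[0] - prox[0]))
```

A fingertip touching directly to the right of the hovering finger body gives 0°, and one directly along the positive y axis gives 90°.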
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
- [Patent Document 1] Japanese Patent Application Laid-Open No. 2005-352619
- [Patent Document 2] Japanese Patent Application Laid-Open No. 2007-80291
- [Patent Document 3] Japanese Patent Application Laid-Open No. 2007-226571
Claims (8)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/324,582 US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-158153 | 2009-07-02 | ||
JP2009158153A JP5402322B2 (en) | 2009-07-02 | 2009-07-02 | Information processing apparatus and information processing method |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/324,582 Continuation US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110004821A1 US20110004821A1 (en) | 2011-01-06 |
US8806336B2 true US8806336B2 (en) | 2014-08-12 |
Family
ID=42797427
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/821,399 Expired - Fee Related US8806336B2 (en) | 2009-07-02 | 2010-06-23 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
US14/324,582 Abandoned US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/324,582 Abandoned US20140351755A1 (en) | 2009-07-02 | 2014-07-07 | Facilitating display of a menu and selection of a menu item via a touch screen interface |
Country Status (3)
Country | Link |
---|---|
US (2) | US8806336B2 (en) |
EP (1) | EP2270642B1 (en) |
JP (1) | JP5402322B2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150177973A1 (en) * | 2013-12-19 | 2015-06-25 | Funai Electric Co., Ltd. | Selection device |
US9251722B2 (en) | 2009-07-03 | 2016-02-02 | Sony Corporation | Map information display device, map information display method and program |
US20160188129A1 (en) * | 2014-12-25 | 2016-06-30 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device and method for displaying interface according to detected touch operation |
US10474352B1 (en) * | 2011-07-12 | 2019-11-12 | Domo, Inc. | Dynamic expansion of data visualizations |
US10726624B2 (en) | 2011-07-12 | 2020-07-28 | Domo, Inc. | Automatic creation of drill paths |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4865063B2 (en) * | 2010-06-30 | 2012-02-01 | 株式会社東芝 | Information processing apparatus, information processing method, and program |
EP2717142A4 (en) * | 2011-05-30 | 2015-01-07 | Honda Motor Co Ltd | Input device |
US10001898B1 (en) | 2011-07-12 | 2018-06-19 | Domo, Inc. | Automated provisioning of relational information for a summary data visualization |
US20130061122A1 (en) * | 2011-09-07 | 2013-03-07 | Microsoft Corporation | Multi-cell selection using touch input |
GB2509651B (en) * | 2011-10-11 | 2015-07-08 | Ibm | Object designation method, device and computer program |
JP5816516B2 (en) * | 2011-10-24 | 2015-11-18 | 京セラ株式会社 | Electronic device, control program, and process execution method |
US9645733B2 (en) * | 2011-12-06 | 2017-05-09 | Google Inc. | Mechanism for switching between document viewing windows |
JP2013161205A (en) | 2012-02-03 | 2013-08-19 | Sony Corp | Information processing device, information processing method and program |
CN104246677A (en) * | 2012-04-20 | 2014-12-24 | 索尼公司 | Information processing device, information processing method, and program |
WO2013187872A1 (en) * | 2012-06-11 | 2013-12-19 | Intel Corporation | Techniques for select-hold-release electronic device navigation menu system |
USD716314S1 (en) * | 2012-08-22 | 2014-10-28 | Microsoft Corporation | Display screen with graphical user interface |
KR102150289B1 (en) * | 2012-08-30 | 2020-09-01 | 삼성전자주식회사 | User interface appratus in a user terminal and method therefor |
US9696879B2 (en) | 2012-09-07 | 2017-07-04 | Google Inc. | Tab scrubbing using navigation gestures |
JP5844707B2 (en) | 2012-09-28 | 2016-01-20 | 富士フイルム株式会社 | Image display control device, image display device, program, and image display method |
JP6033061B2 (en) * | 2012-11-30 | 2016-11-30 | Kddi株式会社 | Input device and program |
KR102095039B1 (en) * | 2013-06-04 | 2020-03-30 | 삼성전자 주식회사 | Apparatus and method for receiving touch input in an apparatus providing a touch interface |
KR101862954B1 (en) | 2013-10-22 | 2018-05-31 | 노키아 테크놀로지스 오와이 | Apparatus and method for providing for receipt of indirect touch input to a touch screen display |
CN103905640A (en) * | 2014-03-12 | 2014-07-02 | 惠州Tcl移动通信有限公司 | Mobile terminal and false-dialing preventing method thereof |
US20150346998A1 (en) * | 2014-05-30 | 2015-12-03 | Qualcomm Incorporated | Rapid text cursor placement using finger orientation |
JP5729513B1 (en) * | 2014-06-06 | 2015-06-03 | 株式会社セガゲームス | Program and terminal device |
JP6260469B2 (en) * | 2014-06-25 | 2018-01-17 | 富士通株式会社 | Data sequence selection method, data sequence selection program, and portable terminal |
US10310675B2 (en) * | 2014-08-25 | 2019-06-04 | Canon Kabushiki Kaisha | User interface apparatus and control method |
US20160062636A1 (en) * | 2014-09-02 | 2016-03-03 | Lg Electronics Inc. | Mobile terminal and control method thereof |
JP6063437B2 (en) * | 2014-12-19 | 2017-01-18 | 株式会社スクウェア・エニックス | Program, computer apparatus, computer processing method, and system |
JP6501533B2 (en) * | 2015-01-26 | 2019-04-17 | 株式会社コロプラ | Interface program for icon selection |
JP6153990B2 (en) * | 2015-11-19 | 2017-06-28 | 富士フイルム株式会社 | Image display control program, image display control device, and image display control method |
US10191611B2 (en) * | 2015-11-27 | 2019-01-29 | GitSuite LLC | Graphical user interface defined cursor displacement tool |
JP2017060861A (en) * | 2016-12-16 | 2017-03-30 | 株式会社スクウェア・エニックス | Program, computer apparatus, computer processing method, and system |
KR102626169B1 (en) * | 2019-03-19 | 2024-01-18 | 삼성전자주식회사 | Display apparatus and controlling method of the display apparatus |
US11328223B2 (en) | 2019-07-22 | 2022-05-10 | Panasonic Intellectual Property Corporation Of America | Information processing method and information processing system |
CN111309206A (en) * | 2020-02-04 | 2020-06-19 | 北京达佳互联信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02242323A (en) * | 1989-03-15 | 1990-09-26 | Matsushita Electric Ind Co Ltd | Method and device for selecting pop-up menu |
US5689667A (en) * | 1995-06-06 | 1997-11-18 | Silicon Graphics, Inc. | Methods and system of controlling menus with radial and linear portions |
JPH09204426A (en) * | 1996-01-25 | 1997-08-05 | Sharp Corp | Method for editing data |
JPH10198517A (en) * | 1997-01-10 | 1998-07-31 | Tokyo Noukou Univ | Method for controlling display content of display device |
JP3744116B2 (en) * | 1997-04-08 | 2006-02-08 | 松下電器産業株式会社 | Display input device |
JPH11136568A (en) * | 1997-10-31 | 1999-05-21 | Fuji Photo Film Co Ltd | Touch panel operation-type camera |
JP2000267808A (en) * | 1999-03-16 | 2000-09-29 | Oki Electric Ind Co Ltd | Input method linking touch panel input device with display device |
JP3358583B2 (en) * | 1999-03-30 | 2002-12-24 | 松下電器産業株式会社 | Car navigation device and its selection screen display method |
JP2001265523A (en) * | 2000-03-21 | 2001-09-28 | Sony Corp | Information input/output system, information input/ output method and program storage medium |
US7246329B1 (en) * | 2001-05-18 | 2007-07-17 | Autodesk, Inc. | Multiple menus for use with a graphical user interface |
JP2004356819A (en) * | 2003-05-28 | 2004-12-16 | Sharp Corp | Remote control apparatus |
JP2006139615A (en) * | 2004-11-12 | 2006-06-01 | Access Co Ltd | Display device, menu display program, and tab display program |
JP4738019B2 (en) * | 2005-02-23 | 2011-08-03 | 任天堂株式会社 | GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD, AND GAME SYSTEM |
US20070097096A1 (en) * | 2006-03-25 | 2007-05-03 | Outland Research, Llc | Bimodal user interface paradigm for touch screen devices |
US7552402B2 (en) * | 2006-06-22 | 2009-06-23 | Microsoft Corporation | Interface orientation using shadows |
KR100774927B1 (en) * | 2006-09-27 | 2007-11-09 | 엘지전자 주식회사 | Mobile terminal, menu and item selection method |
US8284168B2 (en) * | 2006-12-22 | 2012-10-09 | Panasonic Corporation | User interface device |
KR101169311B1 (en) * | 2007-04-26 | 2012-08-03 | 노키아 코포레이션 | Method, device, module, apparatus, and computer program for an input interface |
CN101689244B (en) * | 2007-05-04 | 2015-07-22 | 高通股份有限公司 | Camera-based user input for compact devices |
KR100837283B1 (en) * | 2007-09-10 | 2008-06-11 | (주)익스트라스탠다드 | Handheld terminal with touch screen |
US20090101415A1 (en) * | 2007-10-19 | 2009-04-23 | Nokia Corporation | Apparatus, method, computer program and user interface for enabling user input |
KR20090047828A (en) * | 2007-11-08 | 2009-05-13 | 삼성전자주식회사 | Content display method and electronic device applying the same |
JP5367339B2 (en) * | 2008-10-28 | 2013-12-11 | シャープ株式会社 | MENU DISPLAY DEVICE, MENU DISPLAY DEVICE CONTROL METHOD, AND MENU DISPLAY PROGRAM |
US20100192101A1 (en) * | 2009-01-29 | 2010-07-29 | International Business Machines Corporation | Displaying radial menus in a graphics container |
DE102013007250A1 (en) * | 2013-04-26 | 2014-10-30 | Inodyn Newmedia Gmbh | Procedure for gesture control |
2009
- 2009-07-02 JP JP2009158153A (JP5402322B2), not active: Expired - Fee Related

2010
- 2010-06-23 US US12/821,399 (US8806336B2), not active: Expired - Fee Related
- 2010-06-23 EP EP20100167047 (EP2270642B1), not active: Not-in-force

2014
- 2014-07-07 US US14/324,582 (US20140351755A1), not active: Abandoned
Patent Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5805167A (en) * | 1994-09-22 | 1998-09-08 | Van Cruyningen; Izak | Popup menus with directional gestures |
US6346938B1 (en) | 1999-04-27 | 2002-02-12 | Harris Corporation | Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model |
US7126579B2 (en) | 2000-08-24 | 2006-10-24 | Siemens Aktiengesellschaft | Method for requesting destination information and for navigating in a map view, computer program product and navigation unit |
JP2004037125A (en) | 2002-06-28 | 2004-02-05 | Clarion Co Ltd | System, method and program for presenting peripheral information in navigation |
US7990455B2 (en) | 2002-09-27 | 2011-08-02 | Fujifilm Corporation | Image information management system |
JP2004233333A (en) | 2003-01-06 | 2004-08-19 | Alpine Electronics Inc | Stereoscopic display method for navigation, and navigation device |
JP2005352619A (en) | 2004-06-09 | 2005-12-22 | Fujitsu Ten Ltd | Function selector |
US7376510B1 (en) | 2004-11-05 | 2008-05-20 | Navteq North America, Llc | Map display for a navigation system |
EP1921419A1 (en) | 2005-09-02 | 2008-05-14 | Matsushita Electric Industrial Co., Ltd. | Image display device and image generation device |
US20100077354A1 (en) * | 2006-01-27 | 2010-03-25 | Microsoft Corporation | Area Selectable Menus |
JP2007226571A (en) | 2006-02-23 | 2007-09-06 | Kyocera Mita Corp | Electronic equipment device depending on touch panel input, and program performing input operation of touch panel |
US8316324B2 (en) * | 2006-09-05 | 2012-11-20 | Navisense | Method and apparatus for touchless control of a device |
JP2007080291A (en) | 2006-11-27 | 2007-03-29 | Fujitsu Ltd | Input processing method and input processing apparatus for implementing the same |
US20080294332A1 (en) | 2007-01-17 | 2008-11-27 | 3-D-V-U Israel (2000) Ltd. | Method for Image Based Navigation Route Corridor For 3D View on Mobile Platforms for Mobile Users |
US20080313538A1 (en) * | 2007-06-12 | 2008-12-18 | Microsoft Corporation | Visual Feedback Display |
JP2009025041A (en) | 2007-07-17 | 2009-02-05 | Pioneer Electronic Corp | System and method for navigation |
GB2451274B (en) * | 2007-07-26 | 2013-03-13 | Displaylink Uk Ltd | A system comprising a touchscreen and one or more conventional display devices |
EP2068235A2 (en) | 2007-12-07 | 2009-06-10 | Sony Corporation | Input device, display device, input method, display method, and program |
US20090281720A1 (en) | 2008-05-08 | 2009-11-12 | Gabriel Jakobson | Method and system for displaying navigation information on an electronic map |
US8245156B2 (en) * | 2008-06-28 | 2012-08-14 | Apple Inc. | Radial menu selection |
US20100079373A1 (en) * | 2008-09-26 | 2010-04-01 | Nintendo Co., Ltd. | Storage medium storing image processing program for implementing controlled image display according to input coordinate, and information processing device |
US20100079405A1 (en) * | 2008-09-30 | 2010-04-01 | Jeffrey Traer Bernstein | Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor |
US8627233B2 (en) * | 2009-03-27 | 2014-01-07 | International Business Machines Corporation | Radial menu with overshoot, fade away, and undo capabilities |
US8549432B2 (en) * | 2009-05-29 | 2013-10-01 | Apple Inc. | Radial menus |
US20110001628A1 (en) | 2009-07-03 | 2011-01-06 | Sony Corporation | Map information display device, map information display method and program |
Non-Patent Citations (1)
Title |
---|
Rohs, M., Real-world interaction with camera phones, Nov. 8, 2004, In Ubiquitous Computing Systems. Second International Symposium, UCS 2004. pp. 74-89. * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9251722B2 (en) | 2009-07-03 | 2016-02-02 | Sony Corporation | Map information display device, map information display method and program |
US10755604B2 (en) | 2009-07-03 | 2020-08-25 | Sony Corporation | Map information display device, map information display method and program |
US10474352B1 (en) * | 2011-07-12 | 2019-11-12 | Domo, Inc. | Dynamic expansion of data visualizations |
US10726624B2 (en) | 2011-07-12 | 2020-07-28 | Domo, Inc. | Automatic creation of drill paths |
US20150177973A1 (en) * | 2013-12-19 | 2015-06-25 | Funai Electric Co., Ltd. | Selection device |
US20160188129A1 (en) * | 2014-12-25 | 2016-06-30 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device and method for displaying interface according to detected touch operation |
Also Published As
Publication number | Publication date |
---|---|
JP5402322B2 (en) | 2014-01-29 |
CN101943989A (en) | 2011-01-12 |
EP2270642B1 (en) | 2015-05-13 |
EP2270642A3 (en) | 2013-12-11 |
JP2011013980A (en) | 2011-01-20 |
US20140351755A1 (en) | 2014-11-27 |
EP2270642A2 (en) | 2011-01-05 |
US20110004821A1 (en) | 2011-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8806336B2 (en) | Facilitating display of a menu and selection of a menu item via a touch screen interface | |
US10642432B2 (en) | Information processing apparatus, information processing method, and program | |
EP3232315B1 (en) | Device and method for providing a user interface | |
US7924271B2 (en) | Detecting gestures on multi-event sensitive devices | |
US9696871B2 (en) | Method and portable terminal for moving icon | |
EP2372516B1 (en) | Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display | |
US11269486B2 (en) | Method for displaying item in terminal and terminal using the same | |
US10108331B2 (en) | Method, apparatus and computer readable medium for window management on extending screens | |
US20110001628A1 (en) | Map information display device, map information display method and program | |
US9459704B2 (en) | Method and apparatus for providing one-handed user interface in mobile device having touch screen | |
US20090102809A1 (en) | Coordinate Detecting Device and Operation Method Using a Touch Panel | |
US9519369B2 (en) | Touch screen selection | |
US20100245242A1 (en) | Electronic device and method for operating screen | |
WO2022012664A1 (en) | Background program control method and apparatus, and electronic device | |
EP3336672A1 (en) | Method and apparatus for providing graphic user interface in mobile terminal | |
EP2474896A2 (en) | Information processing apparatus, information processing method, and computer program | |
KR20140017429A (en) | Method of screen operation and an electronic device therof | |
JP6123879B2 (en) | Display device, display method, program thereof, and terminal device | |
WO2016183912A1 (en) | Menu layout arrangement method and apparatus | |
US20140085340A1 (en) | Method and electronic device for manipulating scale or rotation of graphic on display | |
CN101943989B (en) | Information processor and information processing method | |
KR20150101703A (en) | Display apparatus and method for processing gesture input |
Legal Events
- AS (Assignment): Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYAZAWA, YUSUKE;HOMMA, FUMINORI;NARITA, TOMOYA;AND OTHERS;REEL/FRAME:024787/0582. Effective date: 20100716
- AS (Assignment): Owner name: NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF. Free format text: CONFIRMATORY LICENSE;ASSIGNOR:UNIVERSITY OF PENNSYLVANIA;REEL/FRAME:025156/0320. Effective date: 20101013
- XAS (Not any more in US assignment database): Free format text: CONFIRMATORY LICENSE;ASSIGNOR:UNIVERSITY OF PENNSYLVANIA;REEL/FRAME:025156/0320
- FEPP (Fee payment procedure): PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
- STCF (Information on status: patent grant): PATENTED CASE
- AS (Assignment): Owner name: SONY CORPORATION, JAPAN. Free format text: CORRECTION BY AFFIDAVIT ERRONEOUSLY FILED PATENT APPLICATION NO. PREVIOUSLY RECORDED AT REEL: 025156 FRAME: 0320. ASSIGNOR(S) HEREBY CONFIRMS CONFIRMATORY LICENSE;ASSIGNOR:SONY CORPORATION;REEL/FRAME:033763/0361. Effective date: 20140613
- AS (Assignment): Owner name: NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF. Free format text: CONFIRMATORY LICENSE;ASSIGNOR:UNIVERSITY OF PENNSYLVANIA;REEL/FRAME:036988/0121. Effective date: 20101013
- MAFP (Maintenance fee payment): PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); Year of fee payment: 4
- FEPP (Fee payment procedure): MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
- LAPS (Lapse for failure to pay maintenance fees): PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
- STCH (Information on status: patent discontinuation): PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
- FP (Lapsed due to failure to pay maintenance fee): Effective date: 20220812