US20140210791A1 - Determining Touch Locations and Forces Thereto on a Touch and Force Sensing Surface
- Publication number
- US20140210791A1 (application US 14/254,098)
- Authority
- US
- United States
- Prior art keywords
- touch
- node
- value
- determining
- mutual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS; G06—COMPUTING, CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers characterised by capacitive transducing means
- G06F3/0446—using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
- G06F3/0445—using two or more layers of sensing electrodes, e.g. using two layers of electrodes separated by a dielectric layer
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
- G06F3/041662—using alternate mutual and self-capacitive scanning
- G06F3/0418—Control or interface arrangements for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Definitions
- the present disclosure relates to capacitive touch sensing, and more particularly, to touch sensing that determines both touch locations and pressure (force) applied at the touch locations.
- Human interface devices include touch control systems that are based on touch sensing surfaces, e.g., pads, screens, etc., using capacitive sensors that change capacitance values when touched. Transforming the touch(es) on the touch sensor into one or more touch locations is non-trivial. Tracking one or more touches on the touch sensor is also challenging. Advanced touch control systems are capable of detecting not only a single touch and/or movement on a touch sensing surface such as a touch screen, but also so-called multi-touch scenarios in which a user touches more than one location and/or moves more than one finger over the respective touch sensing surface, e.g., gesturing.
- Present-technology touch sensors generally can determine only the location of a touch, not the force applied to the touch sensing surface. Being able to determine not only the X-Y coordinate location of a touch but also the force of that touch gives another control option for a device whose touch sensing surface has such a force sensing feature.
- a method for decoding multiple touches and forces thereof on a touch sensing surface may comprise the steps of: scanning a plurality of channels aligned on an axis for determining self capacitance values of each of the plurality of channels; comparing the self capacitance values to determine which one of the channels has a local maximum self capacitance value; scanning a plurality of nodes of the at least one channel having the local maximum self capacitance value for determining mutual values of the nodes; comparing the mutual values to determine which one of the nodes has the largest mutual capacitance value, wherein the node having the largest mutual capacitance value on the local maximum self capacitance value channel may be a potential touch location; and determining a force at the potential touch location from a change in the mutual capacitance values of the node at the potential touch location during no touch and during a touch thereto.
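- The decode sequence recited above (self scan of all channels, mutual scan of only the selected channel, force taken from the change in the node's mutual value) can be illustrated with a minimal C sketch. The hardware hooks measure_self() and measure_mutual() and the stored no-touch baseline are assumptions for illustration only, not the patent's implementation:

```c
#include <stdint.h>
#include <stdio.h>

#define NUM_CHANNELS 12   /* e.g. conductive columns X01..X12 */
#define NUM_NODES     9   /* e.g. intersections with rows Y01..Y09 */

/* Hypothetical hardware hooks returning raw capacitance counts. */
extern uint16_t measure_self(int channel);
extern uint16_t measure_mutual(int channel, int node);
/* No-touch reference values captured at start-up (assumed available). */
extern uint16_t baseline_mutual[NUM_CHANNELS][NUM_NODES];

void decode_one_touch(void)
{
    /* 1) Self scan: find the channel with the maximum self value. */
    int best_ch = 0;
    uint16_t best_self = 0;
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        uint16_t s = measure_self(ch);
        if (s > best_self) { best_self = s; best_ch = ch; }
    }

    /* 2) Mutual scan of only that channel: the node with the largest
     *    mutual value is the potential touch location. */
    int best_node = 0;
    uint16_t best_mutual = 0;
    for (int node = 0; node < NUM_NODES; node++) {
        uint16_t m = measure_mutual(best_ch, node);
        if (m > best_mutual) { best_mutual = m; best_node = node; }
    }

    /* 3) Force: change of the node's mutual value relative to its
     *    no-touch baseline. */
    int force = (int)best_mutual - (int)baseline_mutual[best_ch][best_node];

    printf("touch at channel %d, node %d, force %d\n", best_ch, best_node, force);
}
```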
- the method may comprise the steps of: determining if at least one of the self values may be greater than a self touch threshold, wherein if yes then continue to the step of scanning a plurality of nodes of the at least one channel having the largest self value, and if no then end a touch detection frame as completed.
- the method may comprise the steps of: determining left and right slope values for the at least one self value, wherein: the left slope value may be equal to the at least one self value minus a self value of a channel to the left of the at least one channel, and the right slope value may be equal to the at least one self value minus a self value of a channel to the right of the at least one channel.
- the method may comprise the steps of: determining if the left slope value may be greater than zero (0) and the right slope value may be less than zero (0), wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if the left slope value may be greater than zero (0) and greater than the right slope value, wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if the left slope value may be less than zero (0) and greater than a percentage of the right slope value, wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if there may be another self value, wherein if yes then return to the step of determining if at least one of the self values may be greater than the self touch threshold value using the another self value, and if no then end a touch detection frame as completed.
- the method may comprise the steps of: determining if at least one of the mutual values may be greater than a mutual touch threshold, wherein if yes then continue to the step of scanning a plurality of nodes of the at least one channel having the largest self value, and if no then end the touch detection frame as completed.
- the method may comprise the steps of: determining a next slope value, wherein the next slope value may be equal to a current mutual value minus a next mutual value of a next node; and determining a previous slope value, wherein the previous slope value may be equal to the current mutual value minus a previous mutual value of a previous node.
- the method may comprise the steps of: determining if the next slope value may be less than zero (0) and the previous slope value may be greater than zero (0), wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if the next slope value may be greater than zero (0) and less than a percentage of the previous slope value, wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if the next slope value may be less than zero (0) and greater than the previous slope value, wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if there may be another mutual value, wherein if yes then return to the step of determining if at least one of the mutual values may be greater than the mutual touch threshold, and if no then continue to the next step; and determining if there may be another self value, wherein if yes then examine another self value and return to the step of determining if at least one of the self values may be greater than the self touch threshold, and if no then end the touch detection frame as completed.
- the step of validating the node may comprise the steps of: identifying the node having a local maximum mutual value as a current node; determining if there may be a valid node north of the current node, wherein if no then continue to the step of determining if there may be a valid node south of the current node, and if yes then perform a mutual measurement on the north node and continue to the next step; determining if the north node may be greater than the current node, if yes then make the north node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step; determining if there may be a valid node south of the current node, wherein if no then continue to the step of determining if there may be a valid node east of the current node, and if yes then perform a mutual measurement on the south node and continue to the next step; determining if the south node may be greater than the current node, if yes then make the south node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step.
- a system for determining gesturing motions and forces thereof on a touch sensing surface having a visual display may comprise: a first plurality of electrodes arranged in a parallel orientation having a first axis, wherein each of the first plurality of electrodes may comprise a self capacitance; a second plurality of electrodes arranged in a parallel orientation having a second axis substantially perpendicular to the first axis, the first plurality of electrodes may be located over the second plurality of electrodes and form a plurality of nodes comprising overlapping intersections of the first and second plurality of electrodes, wherein each of the plurality of nodes may comprise a mutual capacitance; a flexible electrically conductive cover over the first plurality of electrodes, wherein a face of the flexible electrically conductive cover forms the touch sensing surface; a plurality of deformable spacers between the flexible electrically conductive cover and the first plurality of electrodes, wherein the plurality of deformable spacers maintains a distance between the flexible electrically conductive cover and the first plurality of electrodes when no force is applied to the flexible electrically conductive cover.
- the digital processor, memory, analog front end and ADC may be provided by a digital device.
- the digital device may comprise a microcontroller.
- the flexible electrically conductive cover may comprise a flexible metal substrate.
- the flexible electrically conductive cover may comprise a flexible non-metal substrate and an electrically conductive coating on a surface thereof.
- the flexible electrically conductive cover may comprise a substantially light transmissive flexible substrate and a coating of Indium Tin Oxide (ITO) on a surface of the flexible substrate.
- the flexible electrically conductive cover may comprise a substantially light transmissive flexible substrate and a coating of Antimony Tin Oxide (ATO) on a surface of the flexible substrate.
- ITO Indium Tin Oxide
- ATO Antimony Tin Oxide
- the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of selecting an object shown in the visual display by touching the object with a first force.
- the method may comprise the step of locking the object in place by touching the object with a second force.
- the method may comprise the step of releasing the lock on the object by touching the object with a third force and moving the touch in a direction across the touch sensing surface.
- the method may comprise the step of releasing the lock on the object by removing the touch at a first force to the object and then touching the object again at a second force.
- the second force may be greater than the first force.
- a method for determining the gesturing motion and the at least one force associated therewith may comprise the steps of: touching a right portion of an object shown in the visual display with a first force; touching a left portion of the object with a second force; wherein when the first force may be greater than the second force the object rotates in a first direction, and when the second force may be greater than the first force the object rotates in a second direction.
- the first direction may be clockwise and the second direction may be counter-clockwise.
- the first and second directions may be substantially perpendicular to the third and fourth directions.
- the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of: changing a size of an object shown in the visual display by touching a portion of the object with a force, wherein the greater the force, the larger the size of the object becomes.
- the size of the object may be fixed when the touch and the force may be moved off of the object.
- the size of the object varies in proportion to the amount of force applied to the object.
- the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of: handling pages of a document shown in the visual display by touching a portion of the document with a force sufficient to flip through the pages.
- the step of removing a currently visible page may further comprise the step of moving the touch at the currently visible page in a first direction parallel with the touch sensing surface.
- the step of inserting the removed page into a new document may comprise the step of touching the removed page with the force near the new document.
- the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of changing values of an alpha-numeric character shown in the visual display by touching the alpha-numeric character with different forces, wherein a first force will cause the alpha-numeric character to increment and a second force will cause the alpha-numeric character to decrement.
- the value of the alpha-numeric character may be locked when the touch may be moved off of the alpha-numeric character and parallel to the touch sensing surface.
- the method for determining the gesturing motion and the at least one force associated therewith may comprise the steps of: incrementing a value of an alpha-numeric character shown in the visual display by touching an upper portion of the alpha-numeric character with a force; and decrementing the value of the alpha-numeric character by touching a lower portion of the alpha-numeric character with the force.
- the value of the alpha-numeric character may be locked when the touch may be moved off of the alpha-numeric character and parallel to the touch sensing surface.
- a speed of incrementing or decrementing the value of the alpha-numeric character may be proportional to a magnitude of the force applied to upper portion or lower portion, respectively, of the alpha-numeric character.
- the alpha-numeric character may be a number.
- the alpha-numeric character may be a letter of an alphabet.
- FIG. 1 illustrates a schematic block diagram of an electronic system having a capacitive touch sensor, a capacitive touch analog front end and a digital processor, according to the teachings of this disclosure
- FIG. 2 illustrates schematic elevational views of metal over capacitive touch sensors, according to the teachings of this disclosure
- FIG. 3 illustrates a schematic elevational view of a touch sensor capable of detecting both locations of touches thereto and forces of those touches, according to the teachings of this disclosure
- FIGS. 4A to 4D illustrate schematic plan views of touch sensors having various capacitive touch sensor configurations, according to the teachings of this disclosure
- FIGS. 4E and 4F illustrate schematic plan views of self and mutual capacitive touch detection of a single touch to a touch sensor, according to the teachings of this disclosure
- FIGS. 4G to 4K illustrate schematic plan views of self and mutual capacitive touch detection of two touches to a touch sensor, according to the teachings of this disclosure
- FIG. 5 illustrates a schematic process flow diagram for multi-touch and force decoding of a touch sensor as shown in FIG. 1 , according to specific example embodiments of this disclosure
- FIG. 6 illustrates a graph of single touch peak detection data, according to specific example embodiments of this disclosure.
- FIG. 7 illustrates a schematic plan diagram of potential touch and mutual touch locations of a touch sensor, according to specific example embodiments of this disclosure.
- FIG. 8 illustrates a schematic plan view diagram of a touch sensor showing a cache data window thereof, according to specific example embodiments of this disclosure
- FIG. 9 illustrates a graph of self scan values and a table of mutual scan values for two touch peak detection data, according to specific example embodiments of this disclosure.
- FIGS. 10 and 11 illustrate schematic diagrams of historic and current point locations used for a point weighting example, according to the teachings of this disclosure
- FIG. 12 illustrates schematic drawings of a normal finger touch and a flat finger touch, according to the teachings of this disclosure
- FIGS. 13 to 23 illustrate schematic process flow diagrams for touch decoding and force determination of the decoded touch(es), according to specific example embodiments of this disclosure
- FIG. 24 illustrates a schematic plan view of a finger of a hand touching a surface of a touch sensor, according to a specific example embodiment of this disclosure
- FIG. 25 illustrates a schematic plan view of two fingers of a hand touching a surface of a touch sensor, according to another specific example embodiment of this disclosure
- FIG. 26 illustrates a schematic plan view of a finger of a hand touching an object projected on a surface of a touch sensor, according to yet another specific example embodiment of this disclosure
- FIG. 27 illustrates a schematic plan view of a finger of a hand touching a document projected on a surface of a touch sensor, according to still another specific example embodiment of this disclosure.
- FIG. 28 illustrates a schematic plan view of a finger of a hand touching one digit of a number projected on a surface of a touch sensor, according to another specific example embodiment of this disclosure.
- a series of optimized processes may be provided that scan a plurality of (electrically) conductive columns and rows arranged in a matrix on a surface, e.g., touch sensor display or panel, and which identify and track a plurality of touches thereto and forces thereof.
- These processes may be further optimized for operation with a low cost 8-bit microcontroller, according to specific embodiments of this disclosure.
- a force thereof may be assigned to the touch based upon the magnitude of change of the capacitance values determined during scans of a touch sensor, as more fully described hereinabove.
- the touch forces applied to the touch sensor from the associated tracked touch points may be utilized in further determining three dimensional gesturing, e.g., X, Y and Z positions and forces, respectively.
- proportional force at a touch location(s) allows three dimensional control of an object projected onto a screen of the touch sensor. Differing pressures on multiple points, e.g., during more than one touch (multiple fingers touching the face of the touch sensor), allow object rotation control.
- a touch at a certain force may allow selecting an object(s) and a touch at a different, e.g., greater, force may be used to fix the location(s) of the object(s) on the display of the touch sensor.
- Rocking multi-touch presses to produce varying touch forces may be used for rotation of an object.
- a vertical motion, e.g., vertical sliding, press may be used to scale a vertical size of an object.
- a horizontal motion, e.g., horizontal sliding, press may be used to scale a horizontal size of an object.
- Touches with varying force may be used to flip through pages of a document.
- a varying force may be used to insert a page into a stack of pages of a document.
- a vertical or horizontal gesture and force may be used to activate a function, e.g., empty trash bin icon. Varying touch pressure may be used to lift a page off of a document for transmission to another display.
- Varying touch pressure may change the scope of a gesture movement, e.g., selecting a picture instead of the full document. Pressing with a sweeping gesture may be used for an object release and discard. Varying touch pressures may be used to select alpha-numeric characters or drop function boxes.
- these processes utilize both self and mutual scans to perform an optimized scan of the plurality of conductive columns and rows used for touch sensing.
- the proposed processes may use a subset of the data from the plurality of conductive columns and rows in order to do all necessary processing for touch location identification and tracking.
- the various embodiments specifically focus on a low-resource requirement solution for achieving touch location identification and tracking.
- self capacitances of either the conductive columns or rows may be measured first then mutual capacitances of only those conductive columns or rows may be measured in combination with the other axis of conductive rows or columns.
- the various embodiments disclosed herein overcome the problem of transforming these self and mutual capacitance measurements into one or more touches and forces thereof, and tracking these one or more touches and forces thereof through multiple frames of the capacitance measurements of the conductive columns or rows as described hereinabove.
- At least one process may scan a plurality of conductive columns and rows arranged in a matrix, detect and track up to N touches, using various unique techniques disclosed and claimed herein.
- a process of peak detection examines slope ratios to accurately and quickly determine peak measurements.
- the challenge of tracking multiple touch locations may be solved through time on associated ones of the plurality of conductive columns or rows.
- the various embodiments may allow for N touches to compensate for touches of different finger positions, e.g., such as a flat finger, that prevents missed touches and substantially eliminates incorrect touches.
- a process for quickly identifying accurate touches instead of only looking at true peaks, wherein a “virtual” peak may be found by examining slope ratios using various techniques disclosed herein for touch identification.
- a combination of unique processes may be used to achieve better accuracy and speed improvements for multi-touch decoding.
- a peak detection process may be implemented as a “fuzzy” peak detection process that examines slope relationships, not just signs of the slopes between the conductive columns measured.
- a so-called “nudge technique” may be used that “nudges” a potential touch location to a best location by examining adjacent values thereto.
- Windowed data cache may be used to accelerate processing in a low capacity RAM environment, e.g., 8-bit microcontroller. Interpolation may be used to increase the touch location resolution based upon measured values adjacent thereto. Multi-touch tracking may be used to identify and track N touches through time. Weighted matching may be used in a weighting method to best match touch points over time. “Area” detection may use a process that allows easy area and/or pressure detection based upon the sum of the nudged values for a given touch location.
- Various embodiments may track eight or more touches and forces thereof on, for example but not limited to, a 3.5 inch capacitive touch sensor array, for example when using a Microchip PIC18F46K22 (64K ROM, about 4K RAM) microcontroller.
- a digital device 112 may comprise a digital processor and memory 106 , an analog-to-digital converter (ADC) controller 108 , and a capacitive touch analog front end (AFE) 110 .
- the digital device 112 may be coupled to a touch sensor 102 comprised of a plurality of conductive columns 104 and rows 105 arranged in a matrix and having a flexible electrically conductive cover 103 thereover.
- the conductive rows 105 and/or conductive columns 104 may be, for example but are not limited to, printed circuit board conductors, wires, Indium Tin Oxide (ITO) or Antimony Tin Oxide (ATO) coatings on a clear substrate, e.g., display/touch screen, etc., or any combinations thereof.
- the flexible electrically conductive cover 103 may comprise metal, conductive non-metallic material, ITO or ATO coating on a flexible clear substrate (plastic), etc.
- the digital device 112 may comprise a microcontroller, microprocessor, digital signal processor, application specific integrated circuit (ASIC), programmable logic array (PLA), etc., and may further comprise one or more integrated circuits (not shown), packaged or unpackaged.
- a capacitive sensor 238 is on a substrate 232 .
- an electrically conductive flexible cover 103, e.g., metal, ITO or ATO coated plastic, etc., is located on top of the spacers 234 and forms a chamber 236 over the capacitive sensor 238.
- when a force 242 is applied to a location on the flexible cover 103, the flexible cover 103 moves toward the capacitive sensor 238, thereby increasing the capacitance thereof.
- the capacitance value(s) of the capacitive sensor(s) 238 is measured and an increase in capacitance value thereof will indicate the location of the force 242 (e.g., touch).
- the capacitance value of the capacitive sensor 238 will increase the closer the flexible cover 103 moves toward the face of the capacitive sensor 238 .
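- The increase in capacitance as the flexible cover approaches the sensor follows the familiar parallel-plate relationship C = ε0·εr·A/d, so capacitance rises as the gap d shrinks. This relation is background physics rather than text from the patent; a short C sketch with assumed plate area and air-gap values shows the trend:

```c
#include <stdio.h>

int main(void)
{
    const double eps0 = 8.854e-12;  /* permittivity of free space, F/m */
    const double epsr = 1.0;        /* relative permittivity of the air gap (assumed) */
    const double area = 1.0e-4;     /* 1 cm^2 effective plate area (assumed) */

    /* As the gap d shrinks under an applied force, C = eps0*epsr*area/d rises. */
    for (double gap_um = 100.0; gap_um >= 25.0; gap_um -= 25.0) {
        double c = eps0 * epsr * area / (gap_um * 1e-6);
        printf("gap %5.1f um -> C = %6.2f pF\n", gap_um, c * 1e12);
    }
    return 0;
}
```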
- Metal over capacitive touch technology is more fully described in Application Note AN1325, entitled “mTouchTM Metal over Cap Technology” by Keith Curtis and Dieter Peter, available at www.microchip.com; and is hereby incorporated by reference herein for all purposes.
- a touch sensor capable of detecting both a location of a touch(es) thereto and a force(s) of that touch(es) thereto may comprise a plurality of conductive rows 105 , a plurality of conductive columns 104 , a plurality of deformable spacers 334 , and a flexible electrically conductive cover 103 .
- the conductive columns 104 and the conductive rows 105 may be used in determining a location(s) of a touch(es), more fully described in Technical Bulletin TB3064, entitled “mTouchTM Projected Capacitive Touch Screen Sensing Theory of Operation” referenced hereinabove, and the magnitude of changes in the capacitance values of the conductive column(s) 104 at and around the touch location(s) may be used in determining the force 242 (amount of pressure applied at the touch location).
- the plurality of deformable spacers 334 may be used to maintain a constant spacing between the flexible conductive cover 103 and a front surface of the conductive columns 104 when no force 242 is being applied to the flexible electrically conductive cover 103 .
- When force 242 is applied to a location on the flexible electrically conductive cover 103, the flexible electrically conductive cover 103 will be biased toward at least one conductive column 104, thereby increasing the capacitance thereof. Direct measurements of capacitance values and/or ratios of the capacitance values may be used in determining the magnitude of the force 242 being applied at the touch location(s).
- digital devices 112, e.g., microcontrollers, may include peripherals that enhance the detection and evaluation of such capacitive value changes. More detailed descriptions of various capacitive touch system applications are more fully disclosed in Microchip Technology Incorporated application notes AN1298, AN1325 and AN1334, available at www.microchip.com, and all are hereby incorporated by reference herein for all purposes.
- CVD capacitive voltage divider
- a Charge Time Measurement Unit (CTMU) may be used for very accurate capacitance measurements.
- the CTMU is more fully described in Microchip application notes AN1250 and AN1375, available at www.microchip.com, and commonly owned U.S. Pat. No. 7,460,441 B2, entitled “Measuring a long time period;” and U.S. Pat. No. 7,764,213 B2, entitled “Current-time digital-to-analog converter,” both by James E. Bartling; wherein all of which are hereby incorporated by reference herein for all purposes.
- any type of capacitance measurement circuit having the necessary resolution may be used in determining the capacitance values of the plurality of conductive columns 104 and nodes (intersections of columns 104 and rows 105 ), and that a person having ordinary skill in the art of electronics and having the benefit of this disclosure could implement such a capacitance measurement circuit.
- FIGS. 4A to 4D depicted are schematic plan views of touch sensors having various capacitive touch sensor configurations, according to the teachings of this disclosure.
- FIG. 4A shows conductive columns 104 and conductive rows 105 .
- Each of the conductive columns 104 has a “self capacitance” that may be individually measured when in a quiescent state, or all of the conductive rows 105 may be actively excited while each one of the conductive columns 104 has self capacitance measurements made thereof. Active excitation of all of the conductive rows 105 may provide a stronger measurement signal for individual capacitive measurements of the conductive columns 104 .
- the self capacitance scan may only determine which one of the conductive columns 104 has been touched, but not at what location along the axis of that conductive column 104 it was touched.
- the mutual capacitance scan may determine the touch location along the axis of that conductive column 104 by individually exciting (driving) one at a time the conductive rows 105 and measuring a mutual capacitance value for each one of the locations on that conductive column 104 that intersects (crosses over) the conductive rows 105 .
- an insulating non-conductive dielectric is between and separates the conductive columns 104 and the conductive rows 105. Where the conductive columns 104 intersect with (cross over) the conductive rows 105, mutual capacitors 120 are thereby formed. During the self capacitance scan above, all of the conductive rows 105 may be either grounded or driven with a logic signal, thereby forming individual column capacitors associated with each one of the conductive columns 104.
- FIGS. 4B and 4C show interleaving of diamond shaped patterns of the conductive columns 104 and the conductive rows 105 .
- This configuration may maximize exposure of each axis conductive column and/or row to a touch (e.g., better sensitivity) with a smaller overlap between the conductive columns 104 and the conductive rows 105 .
- FIG. 4D shows receiver (top) conductive rows (e.g., electrodes) 105 a and transmitter (bottom) conductive columns 104 a comprising comb like meshing fingers.
- the conductive columns 104 a and conductive rows 105 a are shown in a side-by-side plan view, but normally the top conductive rows 105 a would be over the bottom conductive columns 104 a .
- FIGS. 4E and 4F depicted are schematic plan views of self and mutual capacitive touch detection of a single touch to a touch sensor, according to the teachings of this disclosure.
- a touch, represented by a picture of a part of a finger, is at approximately the coordinates of X05, Y07.
- each one of the rows Y01 to Y09 may be measured to determine the capacitance values thereof.
- baseline capacitance values with no touches thereto for each one of the rows Y01 to Y09 have been taken and stored in a memory (e.g., memory 106 — FIG. 1 ).
- mutual capacitive detection may be used in determining where on the touched row (Y07) the touch has occurred. This may be accomplished by exciting, e.g., putting a voltage pulse on, each of the columns X01 to X12 one at a time while measuring the capacitance value of row Y07 when each of the columns X01 to X12 is individually excited.
- the column (X05) excitation that causes the largest change in the capacitance value of row Y07 will be the location on that row which corresponds to the intersection of column X05 with row Y07, thus the single touch is at point or node X05, Y07.
- the self capacitances of the columns X01 to X12 may be determined first, then mutual capacitances of a selected column(s) determined by exciting each row Y01 to Y09 to find the touch location on the selected column(s).
- FIGS. 4G to 4K depicted are schematic plan views of self and mutual capacitive touch detection of two touches to a touch sensor, according to the teachings of this disclosure.
- two touches, represented by a picture of parts of two fingers, are at approximately the coordinates of X05, Y07 for touch #1 and X02, Y03 for touch #2.
- each one of the rows Y01 to Y09 may be measured to determine the capacitance values thereof.
- baseline capacitance values with no touches thereto for each one of the rows Y01 to Y09 have been taken and stored in a memory (e.g., memory 106 — FIG. 1 ).
- any significant capacitance changes to the baseline capacitance values of the rows Y01 to Y09 will be obvious and taken as finger touches.
- the first finger is touching row Y07 and the second finger is touching row Y03, wherein the capacitance values of those two rows will change, indicating touches thereto.
- mutual capacitive detection may be used in determining where on these two touched rows (Y07 and Y03) the touches have occurred. Referring to FIG. 4I , this may be accomplished by exciting, e.g., putting a voltage pulse on, each of the columns X01 to X12 one at a time while measuring the capacitance value of row Y07 when each of the columns X01 to X12 is individually excited.
- the column (X05) excitation that causes the largest change in the capacitance value of row Y07 will be the location on that row that corresponds to the intersection of column X05 with row Y07. Referring to FIG.
- the two touches are at points or nodes (X05, Y07) and (X02, Y03). It is contemplated and within the scope of this disclosure that if the capacitances of more than one of the selected rows, e.g., Y07 and Y03, can be measured simultaneously, then only one set of individual column X01 to X12 excitations is needed in determining the two touches to the touch sensor 102.
- a process of multi-touch decoding may comprise the steps of Data Acquisition 502 , Touch Identification 504 , Force Identification 505 , Touch and Force Tracking 506 , and Data Output 508 .
- the step of Touch Identification 504 may further comprise the steps of Peak Detection 510 , Nudge 512 and Interpolation 514 , more fully described hereinafter.
- Data Acquisition 502 is the process of taking self capacitance measurements of the plurality of conductive columns 104 or conductive rows 105 , and then mutual capacitance measurements of selected ones of the plurality of conductive columns 104 or conductive rows 105 , and intersections of the plurality of conductive rows 105 or conductive columns 104 , respectively therewith, to acquire touch identification data.
- the touch identification data may be further processed to locate potential touches and forces thereto on the touch sensor 102 using the process of Touch Identification 504 and Force Identification 505 , respectively, as more fully described hereinafter.
- Touch Identification 504 is the process of using the touch identification data acquired during the process of Data Acquisition 502 to locate potential touches on the touch sensor 102 .
- the following are a sequence of process steps to determine which ones of the plurality of conductive columns 104 or conductive rows 105 to select that have a touch(es) thereto using self capacitance measurements thereof, and where on the selected conductive columns 104 or conductive rows 105 the touch(es) may have occurred using mutual capacitance measurements thereof.
- Peak detection 510 is the process of identifying where potential touch locations may be on the touch sensor 102 .
- peak detection may purposely be made “fuzzy,” e.g., identifying potential peaks by looking for ratios of differences of slope values as well as slope “signs,” not just a low-high-low value sequence.
- a “virtual” peak may be detected by examining slope ratios, e.g., 2:1 slope ratio, wherein a change in slope may be identified as a potential peak. This may be repeated until no additional peaks are detected.
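- A minimal C sketch of such a "fuzzy" peak detector is shown below. It flags a node either on a positive-to-negative slope transition (a true peak) or when the ratio of the incoming to the outgoing positive slope exceeds a threshold (2:1 here, matching the example later in this disclosure); the sample data and the omission of the negative-slope case are illustrative simplifications, not the patent's implementation:

```c
#include <stdio.h>

#define RATIO_NUM 2   /* 2:1 slope-ratio threshold (example value) */
#define RATIO_DEN 1

/* Returns non-zero if index i of v[] is a potential peak. */
static int is_potential_peak(const int *v, int n, int i)
{
    if (i <= 0 || i >= n - 1) return 0;
    int left  = v[i] - v[i - 1];   /* slope coming into node i */
    int right = v[i + 1] - v[i];   /* slope leaving node i     */

    /* True peak: positive-to-negative slope transition. */
    if (left > 0 && right < 0) return 1;

    /* "Virtual" peak: still rising, but the slope collapses by more than
     * the ratio threshold (negative-slope case omitted for brevity). */
    if (left > 0 && right > 0 &&
        (long)left * RATIO_DEN > (long)right * RATIO_NUM) return 1;

    return 0;
}

int main(void)
{
    /* Example values loosely modeled on the FIG. 6 discussion (rows 1..9). */
    int v[] = { 2, 3, 4, 5, 7, 30, 40, 10, 4 };
    int n = (int)(sizeof v / sizeof v[0]);

    for (int i = 0; i < n; i++)
        if (is_potential_peak(v, n, i))
            printf("potential peak at index %d (value %d)\n", i, v[i]);
    return 0;
}
```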
- Nudge 512 is the process of examining each adjacent location of a potential touch location once it has been identified. If an adjacent location has a greater value than the existing potential touch location, then eliminate the current potential touch location and identify the adjacent location having the greater value as the potential touch location (see FIG. 8 and the description thereof hereinafter).
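- A minimal C sketch of the Nudge step, assuming a hypothetical mutual_value() accessor and considering only the four tier-one neighbours (the tier-two handling and the column caching described later are omitted):

```c
#include <stdint.h>

#define COLS 12
#define ROWS  9

/* Hypothetical accessor returning the mutual value at a node. */
extern uint16_t mutual_value(int col, int row);

/* Shift ("nudge") the potential touch location until it sits on a node whose
 * mutual value is not exceeded by any of its tier-one neighbours. */
void nudge(int *col, int *row)
{
    static const int dc[4] = { -1, 1, 0, 0 };
    static const int dr[4] = { 0, 0, -1, 1 };

    for (;;) {
        int best_c = *col, best_r = *row;
        uint16_t best = mutual_value(best_c, best_r);

        for (int k = 0; k < 4; k++) {
            int c = *col + dc[k], r = *row + dr[k];
            if (c < 0 || c >= COLS || r < 0 || r >= ROWS) continue;
            uint16_t v = mutual_value(c, r);
            if (v > best) { best = v; best_c = c; best_r = r; }
        }

        if (best_c == *col && best_r == *row) break;  /* local peak reached */
        *col = best_c;                                 /* nudge and repeat  */
        *row = best_r;
    }
}
```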
- Interpolation 514 is the process that examines the adjacent values to generate a higher resolution location.
- Force Identification 505 is the process of using some of the touch identification data acquired during the process of Data Acquisition 502 in combination with the potential touch locations identified during the process of Touch Identification 504 .
- the mutual capacitance measurements associated with the potential touch locations, determined during the process of Touch Identification 504 may be compared with reference capacitance values of those same locations with no touches applied thereto (smaller capacitance values). The magnitude of a capacitance change may thereby be used in determining the force applied by the associated potential touch previously determined.
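- A short C sketch of this force determination follows; the quantisation into a lighter "first force" and a firmer "second force" (as used by the gestures described above) and the threshold counts are illustrative assumptions:

```c
#include <stdint.h>

typedef enum { FORCE_NONE, FORCE_LIGHT, FORCE_FIRM } force_level_t;

#define LIGHT_THRESHOLD 20u   /* counts above the no-touch reference (example) */
#define FIRM_THRESHOLD  60u   /* example value */

/* Force at a confirmed touch node: measured mutual value minus the stored
 * no-touch reference, quantised into discrete force levels. */
force_level_t force_at_node(uint16_t measured, uint16_t reference)
{
    uint16_t delta = (measured > reference) ? (uint16_t)(measured - reference) : 0;

    if (delta >= FIRM_THRESHOLD)  return FORCE_FIRM;   /* e.g. a "second force" */
    if (delta >= LIGHT_THRESHOLD) return FORCE_LIGHT;  /* e.g. a "first force"  */
    return FORCE_NONE;
}
```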
- Touch and Force Tracking 506 is the process of comparing time sequential “frames” of touch identification data and then determining which touches are associated between sequential frames.
- a combination of weighting and “best guess” matching may be used to track touches and forces thereof through multiple frames during the process of Data Acquisition 502 described hereinabove. This is repeated for every peak detected and every touch that was identified on the previous frame.
- a "frame" is the set of self and mutual capacitive measurements of the plurality of conductive columns 104 or conductive rows 105 that captures a single set of touches at a specific time. Each full set of scans (a "frame") of the self and mutual capacitance measurements of the plurality of conductive columns 104 or conductive rows 105 acquires the touch identification data of the touch sensor 102 at the time associated with that frame.
- Touch and Force Tracking 506 associates a given touch in one frame with a given touch in a subsequent frame.
- Touch and Force tracking may create a history of touch frames, and may associate the touch locations of a current frame with the touch locations of a previous frame or frames.
- a “weighting” function may be used.
- the weight values (“weight” and “weight values” will be used interchangeably herein) between time sequential touch locations (of different frames) represent the likelihood that time sequential touch locations (of different frames) are associated with each other.
- Distance calculations may be used to assign weight values between these associated touch locations.
- a “true” but complex and processor intensive calculation for determining weight value between touch locations is:
- Weight value = SQRT[(X_previous − X_current)² + (Y_previous − Y_current)²]   Eq. (1)
- a simplified distance (weight value) calculation may be used that measures ⁇ X and ⁇ Y and then sums them together:
- Weight value′ = ABS(X_previous − X_current) + ABS(Y_previous − Y_current)   Eq. (2)
- Eq. (2) creates a diamond shaped pattern for a given weight value instead of a circular pattern of the more complex weight value calculation, Eq. (1).
- Use of Eq. (2) may be preferred for speed of the weight value calculations in a simple processing system: distance may be calculated based upon the sum of the change in the X-distance and the change in the Y-distance, per Eq. (2) herein above.
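- The two weight calculations can be compared directly; the following C sketch (the point type and sample coordinates are illustrative) implements Eq. (1) and Eq. (2) side by side:

```c
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct { int x, y; } point_t;   /* illustrative point type */

/* Eq. (1): exact Euclidean distance between sequential touch locations. */
static double weight_eq1(point_t prev, point_t cur)
{
    double dx = prev.x - cur.x, dy = prev.y - cur.y;
    return sqrt(dx * dx + dy * dy);
}

/* Eq. (2): simplified sum of |dX| and |dY| (diamond-shaped iso-weight). */
static int weight_eq2(point_t prev, point_t cur)
{
    return abs(prev.x - cur.x) + abs(prev.y - cur.y);
}

int main(void)
{
    point_t prev = { 5, 7 }, cur = { 8, 3 };
    printf("Eq.(1) weight = %.2f, Eq.(2) weight = %d\n",
           weight_eq1(prev, cur), weight_eq2(prev, cur));
    return 0;
}
```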
- a better weight value may be defined as a smaller distance between sequential touch locations.
- a weight value may be calculated for all touch locations from the previous frame.
- the new touch location is then associated with the previous touch location having the best weight value therebetween. If the previous touch location already has an associated touch location from a previous frame, the second-best weight value for each touch location may be examined. The touch location with the lower-cost second-best weight value may then be shifted to its second-best location, and the other touch location may be kept as the best touch location. This process is repeated until all touch locations have been associated with previous frame touch locations, or have been identified as “new touches” having new locations with no touch locations from the previous frame being close to the new touch location(s).
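- The following C sketch illustrates the idea with a simplified greedy variant: remaining new/previous pairs are matched in order of increasing weight, and anything left unmatched (or too far away) becomes a new touch. The disclosure's best/second-best shifting refines this in conflict cases; the sizes, distance limit and sample points here are assumptions:

```c
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

#define MAX_TOUCHES 8

typedef struct { int x, y; } point_t;

/* Eq. (2)-style weight: smaller means more likely the same touch. */
static int weight(point_t a, point_t b)
{
    return abs(a.x - b.x) + abs(a.y - b.y);
}

/* assoc[i] = index of the previous-frame touch matched to new touch i,
 * or -1 if it is a new touch with no close previous location. */
static void associate(const point_t *prev, int n_prev,
                      const point_t *cur, int n_cur,
                      int *assoc, int max_weight)
{
    int used_prev[MAX_TOUCHES] = { 0 }, used_cur[MAX_TOUCHES] = { 0 };
    for (int i = 0; i < n_cur; i++) assoc[i] = -1;

    for (;;) {   /* take the globally smallest remaining weight each pass */
        int best = INT_MAX, bi = -1, bj = -1;
        for (int i = 0; i < n_cur; i++) {
            if (used_cur[i]) continue;
            for (int j = 0; j < n_prev; j++) {
                if (used_prev[j]) continue;
                int w = weight(prev[j], cur[i]);
                if (w < best) { best = w; bi = i; bj = j; }
            }
        }
        if (bi < 0 || best > max_weight) break;   /* nothing (close) left */
        assoc[bi] = bj;
        used_cur[bi] = used_prev[bj] = 1;
    }
}

int main(void)
{
    point_t prev[] = { { 2, 2 }, { 9, 5 } };           /* frame N-1 */
    point_t cur[]  = { { 3, 2 }, { 9, 6 }, { 1, 8 } }; /* frame N   */
    int assoc[MAX_TOUCHES];

    associate(prev, 2, cur, 3, assoc, 6);
    for (int i = 0; i < 3; i++) {
        if (assoc[i] < 0) printf("new touch %d: no previous match (new touch)\n", i);
        else              printf("new touch %d: matched to previous touch %d\n", i, assoc[i]);
    }
    return 0;
}
```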
- An alternative to the aforementioned weighting process may be a vector-based process utilizing a vector created from the previous two locations to create the most likely next location.
- This vector-based weighting process may use the same distance calculations as the aforementioned weighting process, running it from multiple points and modifying the weight values based upon from which point the measurement was taken.
- the next “most likely” location of that touch may be predicted.
- Once the extrapolated location has been determined that location may be used as the basis for a weighting value.
- an “acceleration model” may be used to add weighting points along the vector to the extrapolated locations and past the extrapolated locations. These additional points assist in detecting changes in speed of the touch movement, but may not be ideal for determining direction of the touch motion.
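- A minimal constant-velocity extrapolation, sketched in C under the assumption that the two previous locations are available, gives the predicted location against which weight values would then be computed:

```c
#include <stdio.h>

typedef struct { int x, y; } point_t;

/* Constant-velocity extrapolation: next = newer + (newer - older). */
static point_t extrapolate(point_t older, point_t newer)
{
    point_t next = { 2 * newer.x - older.x, 2 * newer.y - older.y };
    return next;
}

int main(void)
{
    point_t older = { 4, 4 }, newer = { 6, 5 };   /* two previous frame locations */
    point_t predicted = extrapolate(older, newer);

    /* Weight values for the new frame would be computed against 'predicted'. */
    printf("predicted next location: (%d, %d)\n", predicted.x, predicted.y);
    return 0;
}
```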
- forces thereto may be assigned to these touch locations based upon the magnitude of change of the capacitance values determined during the process of Data Acquisition 502 , as more fully described hereinabove. Also the forces applied to the touch sensor 102 from the associated tracked touch points may be utilized in further determining three dimensional gesturing, e.g., X-Y and Z directions.
- FIGS. 10 and 11 depicted are schematic diagrams of historic and current point locations used for a point weighting example, according to the teachings of this disclosure.
- the best combination of weight values and associated touches may be generated. Certain touch scenarios may cause nearly identical weight values, in which case the second best weight values should be compared and associations appropriately shifted.
- points A and D may be associated first.
- when the weight values for B are generated, BD is a better match than BC. In this case, look at secondary weight values: is it less costly to shift A to be associated with C or to shift B to be associated with C?
- points A and B are existing points, and points 1 and 2 are “new” points that need to be associated.
- Step 1) Calculate weight values between touch locations:
- FIG. 12 depicted are schematic drawings of a normal finger touch and a flat finger touch, according to the teachings of this disclosure.
- One challenge of identifying a touch is the “flat finger” scenario. This is when the side or flat part of a finger 1020 , rather than the finger tip 1022 , is placed on the touch sensor 102 .
- a flat finger 1020 may generate two or more potential touch locations 1024 and 1026 . It is possible using the teaching of this disclosure to detect a flat finger 1020 by accumulating the sum of the values of all nodes nudged to each peak. If the sum of these values surpasses a threshold then it is likely caused by a flat finger touch. If a flat finger touch is detected then other touches that are near the flat finger peak(s) may be suppressed.
- comparing the forces associated with the two or more potential touch locations 1024 and 1026 may also be used in detecting a flat finger 1020 situation.
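- A short C sketch of the sum-based flat-finger test described above; the threshold value is an illustrative assumption:

```c
#include <stdint.h>

#define FLAT_FINGER_SUM_THRESHOLD 400u   /* example value */

/* values[]: mutual deltas of all nodes that were nudged to this peak.
 * Returns non-zero if their sum suggests a flat-finger touch, in which case
 * nearby weaker peaks may be suppressed. */
int is_flat_finger(const uint16_t *values, int count)
{
    uint32_t sum = 0;
    for (int i = 0; i < count; i++)
        sum += values[i];
    return sum > FLAT_FINGER_SUM_THRESHOLD;
}
```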
- Data Output 508 is the process of providing determined touch location coordinates and associated forces applied thereto in a data packet(s) to a host system for further processing.
- a key threshold of slope ratios may be used to flag additional peaks.
- the threshold value used may be, for example but is not limited to, 2:1; so instances where there is a change of slope greater than 2:1 may be identified as potential peaks. This applies to positive and negative slopes. This would be the point circled in column 6 of the example data values shown in FIG. 6 .
- the self scan is only one axis of a two-axis sensor array (e.g., conductive rows 105 and conductive columns 104 of touch sensor 102 , FIG. 1 ), it is possible for two touches that are off by a single “bar” (e.g., column) to only show a single peak. With the example data, there could be two touches, one at 6,6 and another at 7,7 (see FIGS. 6 and 9 ). Without the additional peak detection, the touch at 6,3 may not be detected.
- each adjacent touch location may be examined to determine if they have a greater value. If a greater value is present, eliminate the current potential touch location and identify the touch location of the greater value as a potential touch location. This process is repeated until a local peak has been identified.
- FIG. 6 depicted is a graph of single touch peak detection data, according to specific example embodiments of this disclosure.
- Slope may be determined by subtracting a sequence of adjacent row data values in a column to produce either a positive or negative slope value.
- slope value When the slope value is positive the data values are increasing, and when the slope value is negative the data values are decreasing.
- a true peak may be identified as a transition from a positive to a negative slope as a potential peak.
- a transition from a positive slope to a negative slope is indicated at data value 422 of the graph shown in FIG. 6 .
- a threshold of slope ratios may further be used to flag additional potential peaks.
- Slope is the difference between two data values of adjacent conductive columns 104 .
- This threshold of slope ratios may be, for example but is not limited to, 2:1 so instances where there is a change of slope greater than 2:1 may be identified as another potential peak. This may apply to both positive and negative slopes.
- the data value 420 , taken at row 6, has a left slope of 23 (30 − 7) and a right slope of 10 (40 − 30).
- the data value 422 , taken at row 7, has a left slope of 10 (40 − 30) and a right slope of −30 (10 − 40).
- the slope ratio for row 6 of 23:10 exceeds the example 2:1 threshold and would be labeled for further processing. All other data values are below the data value threshold and may be ignored.
- FIG. 7 depicted is a schematic plan diagram of potential touch and mutual touch locations of a touch sensor, according to specific example embodiments of this disclosure.
- each adjacent location thereto may be examined to determine whether any one of them may have a greater data value than the current potential touch location (labeled “C” in FIGS. 7(a) and 7(b)). If a greater data value is found, then the current potential touch location may be eliminated and the touch location having the greater value may be identified as a potential touch location. This is referred to herein as the process of Nudge 512 and may be repeated until a data peak has been identified.
- tier one nodes (labeled “1” in FIGS. 7(a) and 7(b)—adjacent locations to the current potential touch location) are examined. If any of these tier one nodes has a larger data value than the data value of the current potential touch location, the current touch location is shifted (“nudged”) to the node having the highest data value and the process of Nudge 512 is repeated. If a tier one node is already associated with a different potential peak, then no further searching is necessary and the current data peak may be ignored.
- Tier two nodes (labeled “2” in FIGS. 7(a) and 7(b)—adjacent locations to the tier one nodes) are examined when there is a potential of a large area activation of the touch sensor 102 .
- the process of Nudge 512 may be sped up by storing the mutual capacitance data values of that one column in a cache memory, then doing the Nudge 512 first on the tier one nodes, and then on the tier two nodes of that one column from the mutual capacitance data values stored in the cache memory. Then only after there are no further nudges to do in that one column will the process of Nudge 512 examine the tier one and tier two nodes from the mutual capacitance measurement scans of the two adjacent columns on either side of the column having the process of Nudge 512 performed thereon.
- Interpolation of the potential touch location may be performed by using the peak data value node (touch location) as well as each adjacent node thereto (e.g., tier one nodes from a prior Nudge 512 ) to create sub-steps between each node. For example, but not limited to, 128 steps may be created between each node.
- node A is the potential touch location and nodes B, C, D and E are tier one nodes adjacent thereto.
- the interpolated X, Y location may be found using the following equations:
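- The equations themselves are not reproduced in this text. A plausible form, consistent with the 128 sub-steps mentioned above and with the fine position computation of step 1516 described later (the difference of the two side values divided by the center value and scaled by 64), and assuming purely for illustration that nodes B and D are the west and east neighbors of A while nodes C and E are its south and north neighbors, would be:

$$X \approx X_A + 64\,\frac{V_D - V_B}{V_A}, \qquad Y \approx Y_A + 64\,\frac{V_E - V_C}{V_A}$$

- Here X_A and Y_A are the coarse node coordinates of node A expressed in sub-steps (128 per node pitch) and V_N is the mutual capacitance value measured at node N; the actual equations of the disclosure may differ in scaling or in node assignment.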
- Referring to FIG. 8, depicted is a schematic plan view diagram of a touch sensor showing a cache data window thereof, according to specific example embodiments of this disclosure.
- the conductive columns 104 of the touch sensor 102 may be scanned column by column for self capacitance values until all conductive columns 104 have been scanned.
- Each conductive column 104 indicating a potential touch from the self capacitance data may be sequentially scanned for determining mutual capacitive values thereof (touch data) and when peaks are discovered they may be processed contemporaneously with the column scan.
- touch data may be stored in a cache memory for further processing.
- Because the process of Nudge 512 looks at the first tier nodes and then, if necessary, the second tier nodes, not all of the touch data from all of the conductive columns 104 need be stored at one time. This allows a simple caching system using a minimum amount of random access memory (RAM), for example, storing five columns of touch data in a cache. The five columns are contiguous and a cache window may move across the columns 104 of the touch sensor 102 one column 104 at a time. It is contemplated and within the scope of this disclosure that more or fewer than five columns of touch data may be stored in a cache memory and processed therefrom, and/or self capacitance scanning by rows instead of columns may be used instead. All descriptions herein may be equally applicable to self capacitance scanning of rows then mutual capacitance scanning by columns of those row(s) selected from the self capacitance scan data.
- When a Mutual Scan of a first or second tier node (capacitive sensor 104) is requested, it may be called first from the cache memory. If the requested node touch data is present in the cache memory, the cache memory returns the requested touch data of that first or second tier node. However, if the requested touch data is not present in the cache memory then the following may occur: 1) If the column of the requested touch data is in the range of the cache window then perform the mutual scan of that column and add the touch data to the cache memory, or 2) If the column of the requested touch data is not in the range of the present cache window then shift the cache window range and perform the mutual scan of the new column and add the resulting touch data from the new cache window to the cache memory.
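- The following is a simplified, non-authoritative sketch of such a windowed cache, assuming a five column window and a hypothetical mutual_scan_column() hardware hook. Unlike the single-column de-allocation described for FIG. 23 later in this disclosure, this sketch simply invalidates the window when it slides, which is easier to show in a few lines.

```c
#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define ROWS        12      /* nodes per column (illustrative)    */
#define CACHE_COLS  5       /* columns held in the sliding window */

/* Hypothetical hardware hook, not an API from this disclosure. */
extern void mutual_scan_column(uint8_t col, int16_t out[ROWS]);

static int16_t cache[CACHE_COLS][ROWS];
static bool    cache_valid[CACHE_COLS];
static uint8_t window_start;            /* left-most column covered by the window */

/* Return the mutual value at (row, col), scanning and/or sliding the
 * window only when the request falls outside of the cached columns. */
int16_t cached_mutual(uint8_t row, uint8_t col)
{
    if (col >= window_start + CACHE_COLS) {     /* beyond the right edge of cache */
        window_start = (uint8_t)(col - (CACHE_COLS - 1));
        memset(cache_valid, 0, sizeof(cache_valid));
    } else if (col < window_start) {            /* beyond the left edge of cache  */
        window_start = col;
        memset(cache_valid, 0, sizeof(cache_valid));
    }
    uint8_t slot = (uint8_t)(col - window_start);
    if (!cache_valid[slot]) {                   /* in range but not yet scanned   */
        mutual_scan_column(col, cache[slot]);
        cache_valid[slot] = true;
    }
    return cache[slot][row];
}
```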
- Referring to FIG. 9, depicted are a graph of self scan values and a table of mutual scan values for two touch peak detection data, according to specific example embodiments of this disclosure. Since a self scan is performed in only one axis (e.g., one column), it is possible for two touches that are offset by a single column to show only a single peak. For the example data values shown in FIG. 9, two touches may have occurred, one at self scan data value 422 and the other indicated at self scan data value 420. Without being aware of changes of slope greater than 2:1, the potential touch represented by self scan data value 420 may have been missed. A first touch may cause data value 422 and a second touch may cause data value 420.
- The processes of Peak Detection 510 and Nudge 512 (FIG. 5), as described hereinabove, may further define these multiple touches as described herein. Once each multiple touch has been defined, a force thereof may be determined and associated with its respective touch.
- A hand 2400 of a user may hover over a face of a touch sensor 102, e.g., touch screen or panel, having a plurality of locations, wherein when at least one of the plurality of locations is touched by a finger 2402 of the hand 2400, the location on the face of the touch sensor 102 and the force thereto are detected and stored for further processing as disclosed herein.
- a light touch of the finger 2402 on the face of the touch sensor 102 may select an object (not shown) displayed by a visual display integral therewith.
- By pressing harder on the selected object, the selected object may be locked in place. Pressing even harder on the locked object and then gesturing to move the object may release the lock on the object.
- pressing on the object selects the object, then pressing harder fixes the object's location. Releasing the pressure (force) on the object then pressing hard on the object again would release the object to move again.
- a finger 2504 over a left portion of the touch sensor 102 and another finger 2506 over a right portion of the touch sensor 102 may be used to rotate an object (not shown) displayed by a visual display integral therewith.
- When the left oriented finger 2504 presses harder than the right oriented finger 2506, the object may rotate counterclockwise about an axis parallel with the axis of the wrist/arm.
- When the right oriented finger 2506 presses harder than the left oriented finger 2504, the object may rotate clockwise about the axis parallel with the axis of the wrist/arm.
- When the fingers 2504 and 2506 are moved in a rotating motion over the face of the touch sensor 102, the object may rotate substantially perpendicular to the axis of the wrist/arm (substantially parallel with the face of the touch sensor 102) and in the direction of the rotation of the fingers 2504 and 2506.
- Referring to FIG. 26, depicted is a schematic plan view of a finger of a hand touching an object projected on a surface of a touch sensor, according to yet another specific example embodiment of this disclosure.
- Pressing on the face of the touch sensor 102 over an object 2608 with a finger 2402 may be used to scale the size of the object. For example, the greater the force of the press (touch) by the finger 2402 the larger in size that the object may be displayed.
- The object may remain at the new larger size or may vary in size in proportion to the force applied to the face of the touch sensor, e.g., a harder press will result in a larger object and a softer press will result in a smaller object.
- the size of the object may follow the amount of force applied by the finger 2402 to the face of the touch sensor 102 .
- Referring to FIG. 27, depicted is a schematic plan view of a finger of a hand touching a document projected on a surface of a touch sensor, according to still another specific example embodiment of this disclosure.
- A document 2710 may be displayed on a face of the touch sensor 102.
- a touch of sufficient force by the finger 2402 to a portion of the document 2710 may be used to flip through pages thereof.
- A finger 2402 movement, for example but not limited to the right, may remove the currently visible page(s) of the document 2710.
- Pressing on a removed page near another new document (not shown) may be used to flip through the new document (not shown) and/or may allow insertion of the removed page into the new document.
- Pressing on a document 2710 flips through a stack of document pages. If the finger 2402 then moves off the document, the selected page may be removed. Pressing on a single page next to a document may flip through the document and then may insert the page when it is dragged over the document.
- Referring to FIG. 28, depicted is a schematic plan view of a finger of a hand touching one digit of a number projected on a surface of a touch sensor, according to another specific example embodiment of this disclosure.
- At least one number or letter, e.g., an alpha-numeric character 2814, may be displayed on the face of the touch sensor 102.
- a finger 2402 may press on a portion of the character 2814 wherein the amount of force by the finger 2402 may cause the character 2814 to increase or decrease alpha-numerically in value, accordingly.
- the finger 2402 may slide off, e.g., up, down or sideways, to leave editing of the character 2814 .
- An increase in the alpha-numeric value may be controlled by pressing the finger 2402 on an upper portion of the character 2814, and a decrease in the alpha-numeric value may be controlled by pressing the finger 2402 on a lower portion of the character 2814.
- The speed of increase or decrease of the alpha-numeric value may be proportional to the amount of force applied by the finger 2402 to the surface of the touch sensor 102.
- More than one finger may be used to contemporaneously increase and/or decrease more than one alpha-numeric character. For example, a finger 2402 may be pressed on a single digit 2814 of a number (124779 shown), whereby the single digit 2814 sequentially flips through numerical values, e.g., 0-9. When a desired numerical value is displayed, the finger 2402 may be dragged off the digit to leave the selected numerical value.
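- Purely as an illustration of mapping touch force to an editing rate (the actual mapping is a design choice and is not specified in this disclosure), a firmware routine might clamp and linearly scale the measured force; FORCE_MIN, FORCE_MAX and the 1..10 steps-per-second range below are made-up tuning values.

```c
#include <stdint.h>

/* Illustrative tuning values only, not taken from this disclosure. */
#define FORCE_MIN   20u
#define FORCE_MAX  200u

/* Map the measured touch force to a digit flip rate. */
uint8_t digit_steps_per_second(uint16_t force)
{
    if (force < FORCE_MIN) return 0;              /* too light: no editing   */
    if (force > FORCE_MAX) force = FORCE_MAX;     /* clamp very hard presses */
    /* Linear map: speed proportional to the applied force, 1..10 steps/s. */
    return (uint8_t)(1u + (9uL * (force - FORCE_MIN)) / (FORCE_MAX - FORCE_MIN));
}
```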
- Referring to FIGS. 13 to 23, depicted are schematic process flow diagrams for touch decoding and force determination of the decoded touch(es), according to specific example embodiments of this disclosure.
- FIG. 13 shows a general overview of possible processes for multi-touch decoding and force determination for a touch sensor 102 enabled device. It is contemplated and within the scope of this disclosure that more, fewer and/or some different processes may be utilized with a touch sensor 102 enabled device and still be within the scope, intent and spirit of this disclosure.
- In step 1050 a device is started, actuated, etc., when in step 1052 power is applied to the device.
- In step 1054 the device may be initialized, and thereafter in step 1056 the process of Touch Identification 504 may begin.
- Step 1057 then determines the force applied at each of those touch locations.
- Touch and force tracking may then be performed on those touches identified in step 1056.
- In step 1060 the touch and force data may be further processed if necessary, otherwise it may be transmitted to the processing and control logic of the device for display and/or control of the device's intended purpose(s) in step 1062.
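- A hypothetical top-level firmware loop mirroring steps 1050 through 1062 might look as follows; the function names are placeholders for the processes named above, not actual APIs of this disclosure.

```c
/* Placeholders for the processes named above; not actual APIs. */
extern void device_power_up(void);           /* steps 1050/1052 */
extern void device_init(void);               /* step 1054       */
extern int  identify_touches(void);          /* step 1056: Touch Identification 504 */
extern void determine_forces(void);          /* step 1057       */
extern void track_touches_and_forces(void);  /* touch and force tracking            */
extern void output_touch_data(void);         /* steps 1060/1062 */

int main(void)
{
    device_power_up();
    device_init();
    for (;;) {                               /* one pass per touch detection frame  */
        if (identify_touches() > 0) {
            determine_forces();
            track_touches_and_forces();
        }
        output_touch_data();
    }
}
```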
- A top or “north” channel or node will mean the channel or node above another channel or node.
- A bottom or “south” channel or node will mean the channel or node below another channel or node.
- A “left” or “west” channel or node will mean the channel or node to the left of another channel or node.
- A right or “east” channel or node will mean the channel or node to the right of another channel or node.
- In step 1102 the process of Touch Identification 504 (FIG. 5) begins.
- In step 1104 a self scan of all channels on one axis may be performed, e.g., either all columns or all rows.
- In step 1106 the first self scan value may be examined.
- In step 1108 the (first or subsequent) self scan value may be compared to a self touch threshold value.
- a self peak detection process 1100 may comprise steps 1110 to 1118 , and is part of the overall process of Peak Detection 510 ( FIG. 5 ). If the self scan value is less than the self touch threshold value as determined in step 1108 , then step 1238 ( FIG. 15 ) may determine whether there are any additional self scan values to be examined. However, if the self scan value is equal to or greater than the self touch threshold value as determined in step 1108 , then step 1110 may calculate a left slope between the self scan value and a self scan value of the channel to the left of the present channel. Then step 1112 may calculate a right slope between the self scan value and a self scan value of the channel to the right of the present channel.
- Step 1114 determines whether the left slope may be greater than zero (positive slope) and the right slope may be less than zero (negative slope), identifying a peak. If a yes result in step 1114, then step 1120 may perform mutual scan measurements on each node of the channel selected from the self scan data. If a no result in step 1114, then step 1116 determines whether the left slope may be greater than zero (positive slope) and may be, for example but is not limited to, two times (twice) greater than the right slope. If a yes result in step 1116, then in step 1120 mutual scan measurements may be performed on each node of the selected self scan channel.
- If a no result in step 1116, then step 1118 determines whether the left slope may be, for example but is not limited to, less than zero (negative slope) and greater than a percentage of the right slope, e.g., fifty (50) percent. If a yes result in step 1118, then step 1120 may perform mutual scan measurements on each node of the channel selected from the self scan data. If a no result in step 1118, then step 1238 (FIG. 15) may determine whether there are any additional columns to be examined based upon the self scan values thereof. Step 1122 may examine a first mutual scan value.
- a mutual peak detection process 1244 may comprise steps 1226 to 1234 , and is part of the overall Peak Detection process 510 ( FIG. 5 ).
- Step 1224 may compare the (first or subsequent) mutual scan value to a mutual touch threshold value, wherein if the mutual scan value is less than the mutual touch threshold value then step 1236 may determine whether there are any additional mutual scan values to be examined. However, if the mutual scan value is equal to or greater than the mutual touch threshold value then step 1226 may calculate a slope to the next mutual scan value node, then step 1228 may calculate a slope to the previous mutual scan value node.
- Step 1230 determines whether the next slope may be less than zero (negative slope) and the previous slope may be greater than zero (positive slope). If a yes result in step 1230 , then step 1350 ( FIG. 16 ) may start the process of Nudge 512 and/or the process of Interpolation 514 ( FIG. 5 ). If a no result in step 1230 , then step 1232 determines whether the next slope may be, for example but is not limited to, greater than zero (positive slope) and less than a percentage of the previous slope. If a yes result in step 1232 , then step 1350 ( FIG. 16 ) may start the process of Nudge 512 and/or the process of Interpolation 514 ( FIG. 5 ).
- If a no result in step 1232, then step 1234 determines whether the next slope may be, for example but is not limited to, less than zero (negative slope) and greater than the previous slope. If a yes result in step 1234, then step 1350 (FIG. 16) may start the process of Nudge 512 and/or the process of Interpolation 514 (FIG. 5). If a no result in step 1234, then step 1236 determines whether there may be any additional mutual values to be examined. If a yes result in step 1236, then step 1242 may examine a next mutual value. If a no result in step 1236, then step 1238 determines whether there may be any additional self scan values to be examined.
- If a yes result in step 1238, then step 1240 examines a next self scan value that may be returned to step 1108 (FIG. 14) for further processing thereof. If a no result in step 1238, then in step 1244 a touch detection frame may be complete.
- Step 1350 may start the process of Nudge 512 and/or the process of Interpolation 514 by using a peak location from the process of Touch Identification 504 ( FIG. 5 ) and may comprise the following process steps: Step 1352 determines whether there may be a valid node to the north. If a no result in step 1352 , then continue to step 1360 . If a yes result in step 1352 , then step 1354 may make a mutual scan measurement of the node to the north. Step 1356 determines whether the mutual scan data of the north node may be greater than the current node. If a no result in step 1356 , then continue to step 1360 . If a yes result in step 1356 , then in step 1358 the north node may become the current node, and then continue to step 1486 ( FIG. 17 ).
- Step 1360 determines whether there may be a valid node to the south. If a no result in step 1360 , then continue to step 1470 ( FIG. 17 ). If a yes result in step 1360 , then step 1362 may make a mutual scan measurement of the node to the south. Step 1364 determines whether the mutual scan data of the south node may be greater than the current node. If a no result in step 1364 , then continue to step 1470 ( FIG. 17 ). If a yes result in step 1364 , then in step 1366 the south node may become the current node, and then continue to step 1486 ( FIG. 17 ).
- step 1470 determines whether there may be a valid node to the east. If a no result in step 1470 , then continue to step 1478 . If a yes result in step 1470 , then step 1472 may make a mutual scan measurement of the node to the east. Step 1474 determines whether the mutual scan data of the east node may be greater than the current node. If a no result in step 1474 , then continue to step 1478 . If a yes result in step 1474 , then in step 1476 the east node may become the current node, and then continue to step 1486 .
- Step 1478 determines whether there may be a valid node to the west. If a no result in step 1478 , then continue to step 1502 ( FIG. 18 ). If a yes result in step 1478 , then step 1480 may make a mutual measurement of the node to the west. Step 1482 determines whether the mutual scan data of the west node may be greater than the current node. If a no result in step 1482 , then continue to step 1502 ( FIG. 18 ). If a yes result in step 1482 , then in step 1484 the west node may become the current node. Step 1486 determines whether a touch point may already exist at the selected node. If a no result in step 1486 , then continue to step 1352 ( FIG. 16 ). If a yes result in step 1486 , then step 1488 may eliminate the current peak, and then continue to step 1236 ( FIG. 15 ).
- a flow diagram of a process of Interpolation 514 may comprise steps 1502 to 1518 .
- Step 1502 determines whether there may be a valid node to the left. If a no result in step 1502 , then continue to step 1510 wherein the left node value may be defined as a center value minus a right value then continue to step 1506 . If a yes result in step 1502 , then step 1504 may perform a mutual scan measurement on the node to the left. Then step 1506 determines whether there may be a valid node to the right. If a no result in step 1506 , then continue to step 1512 wherein the right node value may be defined as a center value minus a left value then continue to step 1516 .
- If a yes result in step 1506, then step 1508 may perform a mutual scan measurement on the node to the right.
- Step 1516 may determine a fine position by subtracting the left value from the right value, dividing the difference thereof by the center value, and then multiplying the result by, for example but is not limited to, the number 64. It is contemplated and within the scope and spirit of this disclosure that many ways of determining valid peaks and nodes may be used as one having ordinary skill in the art of touch detection and tracking could readily implement by having knowledge based upon the teachings of this disclosure.
- Then step 1514 determines whether an Interpolation 514 may have been performed for each axis. If a no result in step 1514, then step 1518 may interpolate another axis, thereafter steps 1502 to 1516 may be repeated, with “above” replacing “left” and “below” replacing “right” in each step. If a yes result in step 1514, then step 1520 may add this touch point to a list of all detected touch points. Then step 1522 may return to step 1236 (FIG. 15) for any additional mutual scan values to be examined.
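- A minimal sketch of the per-axis fine position computation of steps 1502 to 1516 is shown below, assuming the left, center and right mutual values for one axis are already available; the function name, the guard against a zero center value, and the integer types are illustrative only. The same routine would be run once per axis, with the values above and below the peak node taking the place of left and right for the second axis.

```c
#include <stdint.h>

/* Sketch of steps 1502-1516: a missing neighbor is mirrored from the
 * opposite side (steps 1510/1512) and the scaled difference gives an
 * offset of roughly -64..+64 sub-steps around the peak node (step 1516). */
int16_t fine_position(int16_t left, int16_t center, int16_t right,
                      uint8_t has_left, uint8_t has_right)
{
    if (!has_left)  left  = (int16_t)(center - right);   /* step 1510 */
    if (!has_right) right = (int16_t)(center - left);    /* step 1512 */
    if (center == 0)
        return 0;                      /* guard only; not part of the flow chart */
    return (int16_t)(((int32_t)(right - left) * 64) / center);   /* step 1516 */
}
```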
- step 1550 starts the process of determining the force applied to the touch sensor 102 at that touch point.
- Untouched mutual capacitances of each point on the touch sensor 102 may be stored in a memory of the digital processor 106 after a “no touch” calibration scan of all points of the touch sensor 102 is performed.
- In step 1552 that mutual capacitance change may be determined, and in step 1554 the mutual capacitance change may be converted into a force value.
- the force value may then be associated with the new touch point and stored in the list of all detected touches.
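- As a non-authoritative sketch of steps 1550 to 1554, assuming the untouched baseline values from the calibration scan are held in an array and that a simple linear scale factor stands in for whatever calibration curve a real design would use:

```c
#include <stdint.h>

#define NODES 108                      /* e.g., a 12 x 9 node sensor (illustrative) */

/* Baseline ("no touch") mutual values captured during the calibration scan. */
static int16_t baseline[NODES];

/* Illustrative linear scale factor standing in for a real calibration curve. */
#define FORCE_SCALE_NUM  5
#define FORCE_SCALE_DEN  8

/* Steps 1552-1554 in sketch form: convert the change in mutual capacitance
 * at the touched node into a force value. */
uint16_t force_at_node(uint16_t node, int16_t touched_value)
{
    int16_t delta = (int16_t)(touched_value - baseline[node]);   /* step 1552 */
    if (delta < 0)
        delta = 0;                     /* ignore drift below the baseline      */
    return (uint16_t)(((int32_t)delta * FORCE_SCALE_NUM) / FORCE_SCALE_DEN);  /* step 1554 */
}
```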
- In step 1602 the process of Touch and Force Tracking 506 may start by using the previously found and current touch locations.
- step 1604 determines whether there may be any current touch locations. If a yes result in step 1604 , then step 1606 may select the first of the current touch locations, and thereafter may continue to step 1722 ( FIG. 21 ). If a no result in step 1604 , then step 1610 determines whether there may be any previous touch location(s). If a yes result in step 1610 , then step 1612 may select the first previous touch location. If a no result in step 1610 , then at step 1611 tracking is complete.
- Step 1614 determines whether the previous touch location may be associated with a current touch location. If a no result in step 1614 , then step 1608 may assert an output of “touch no longer present at previous touch location, stop tracking,” and then return to step 1616 . If a yes result in step 1614 , then step 1616 determines whether there may be any more previous touch locations. If a no result in step 1616 , then at step 1620 tracking touch locations is complete and the touch location data may be transmitted as Data Output 508 ( FIG. 5 ) for further processing by the microcontroller 112 ( FIG. 1 ). If a yes result in step 1616 , then step 1618 may select the next previous touch location, and thereafter return to step 1614 .
- step 1722 determines whether there may be any previous touch locations. If a no result in step 1722 , then continue to step 1868 ( FIG. 22 ) wherein a “New Touch to track is identified” at current location, and thereafter continue to step 1856 ( FIG. 22 ). If a yes result in step 1722 , then step 1724 may set a temporary weight value to a maximum weight value. Step 1726 may select the first of the previous touch locations. Then step 1728 may measure a distance between the selected current touch location and the selected previous touch location to determine a current distance (weight value) therebetween. Step 1730 determines whether the current weight value may be less than the temporary weight value.
- If a yes result in step 1730, then step 1732 may set the temporary weight value to the current weight value and thereafter may record the selected previous touch location as a temporary location and continue to step 1734. If a no result in step 1730, then step 1734 determines whether there may be more previous touch locations. If a yes result in step 1734, then step 1736 may select the next previous touch location, and thereafter return to step 1728. If a no result in step 1734, then step 1738 determines whether the temporary location may have already been assigned to a different current location. If a yes result in step 1738, then step 1740 may calculate a next worst weight value for the current location and for an assigned current location, and thereafter continue to step 1860 (FIG. 22). If a no result in step 1738, then continue to step 1850 (FIG. 22).
- step 1850 determines whether the weight value may be below a maximum association threshold. If a no result in step 1850 , then step 1854 may identify a new touch location for tracking. If a yes result in step 1850 , then step 1852 may assign a new temporary location to the current location and then continue to step 1856 . Step 1860 determines whether the next worst weight value for the current location may be less than the next worst weight value for the assigned location. If a yes result in step 1860 , then step 1862 may set the temporary location to the next worst location and thereafter continue to step 1856 . If a no result in step 1860 , then step 1864 may set the assigned location to the next worst weight value.
- Step 1866 may select a moved assignment location and thereafter return to step 1722 ( FIG. 21 ).
- Step 1856 determines whether there may be more current touch locations. If a yes result in step 1856 , then step 1858 may select the next current touch location and thereafter return to step 1722 ( FIG. 21 ).
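- A non-authoritative sketch of the distance ("weight") matching of steps 1722 to 1736 follows, using squared distance so that no square root is needed on a small microcontroller; the structure layout, the association threshold value, and the omission of the conflict resolution of steps 1738 to 1866 are simplifications, not part of this disclosure.

```c
#include <stdint.h>

#define MAX_WEIGHT      0x7FFFFFFFL
#define ASSOC_THRESHOLD 900L     /* illustrative maximum association distance^2 */

typedef struct { int16_t x, y; uint8_t valid; } touch_t;

/* Squared distance between two touch points, used as the "weight". */
static int32_t weight(const touch_t *a, const touch_t *b)
{
    int32_t dx = (int32_t)a->x - b->x;
    int32_t dy = (int32_t)a->y - b->y;
    return dx * dx + dy * dy;
}

/* For one current touch, find the closest previous touch location.
 * Returns the index of the best previous location, or -1 when none is
 * close enough, in which case the caller tracks it as a new touch. */
int match_previous(const touch_t *current, const touch_t prev[], int prev_count)
{
    int32_t best = MAX_WEIGHT;
    int best_idx = -1;
    for (int i = 0; i < prev_count; i++) {
        if (!prev[i].valid)
            continue;
        int32_t w = weight(current, &prev[i]);
        if (w < best) { best = w; best_idx = i; }
    }
    return (best <= ASSOC_THRESHOLD) ? best_idx : -1;
}
```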
- Step 1902 may receive a mutual scan location request.
- Step 1904 determines whether the mutual scan area location requested may be stored in the cache memory. If a yes result in step 1904, then step 1920 determines whether the mutual scan data stored in the cache memory may be valid. If a yes result in step 1920, then step 1922 may return mutual scan data from the cache memory. If a no result in step 1920, then step 1918 may perform a mutual scan at the requested location, wherein step 1916 may write the mutual scan data to a location in the cache memory and then return back to step 1922.
- If a no result in step 1904, then step 1906 determines whether the requested touch location may be beyond the right edge of the cache. If a yes result in step 1906, then step 1908 may de-allocate the left-most column of mutual scan data from the cache memory. In step 1910 the de-allocated mutual scan data may be allocated to the right edge of the cache memory so as to move the edge values thereof, and thereafter return to step 1904. If a no result in step 1906, then step 1914 may de-allocate the right-most column of data from the cache memory. In step 1912 the de-allocated mutual scan data may be allocated to the left edge of the cache memory so as to move the edge values thereof, and thereafter return to step 1904.
Description
- This application is a continuation-in-part of U.S. patent application Ser. No. 13/830,891; filed Mar. 14, 2013; which claims priority to U.S. Provisional Patent Application No. 61/617,831; filed Mar. 30, 2012. This application is a continuation-in-part of U.S. patent application Ser. No. 14/097,370; filed Dec. 5, 2013; which claims priority to U.S. Provisional Patent Application Ser. No. 61/777,910; filed Mar. 12, 2013; wherein all of which are hereby incorporated by reference herein for all purposes.
- The present disclosure relates to capacitive touch sensing, and more particularly, to touch sensing that determines both touch locations and pressure (force) applied at the touch locations.
- Human interface devices include touch control systems that are based on touch sensing surfaces, e.g., pads, screens, etc., using capacitive sensors that change capacitance values when touched. Transforming the touch(es) on the touch sensor into one or more touch locations is non-trivial. Tracking one or more touches on the touch sensor is also challenging. Advanced touch control systems are capable of detecting not only a single touch and/or movement on a touch sensing surface such as a touch screen, but also so-called multi-touch scenarios in which a user touches more than one location and/or moves more than one finger over the respective touch sensing surface, e.g., gesturing.
- Key challenges of multi-touch systems are: limited processing speed of low cost systems, such as processing capabilities of, for example but not limited to, 8-bit microcontroller architectures as these architectures may be unable to do advanced math for processing the respective signals generated by the touch sensing device. There may also exist limited touch scanning performance, for example the entire system may be unable to reasonably sample the entire plane of the touch sensor or screen every “frame.” Other challenges include having enough program memory space to provide for touch location determination programs that are concise, modular and general purpose. Limited random access memory (RAM) space may make the touch determination system unable to store multiple entire “images” of the touch detection and location(s) thereof simultaneously.
- Hence, there exists a need to improve and simplify touch determination methods. Conventional solutions were threshold based and required complex computations. Hence, there is a need for touch determination methods that are more robust and less computation intensive. Furthermore, there exists a need for high quality multi-touch decoding, in particular, a method and/or system that can be implemented with, for example but not limited to, a low-cost 8-bit micro controller architecture.
- Present technology touch sensors generally can only determine a location of a touch thereto, but not a force value of the touch to the touch sensing surface. Being able to determine not only the X-Y coordinate location of a touch but also the force of that touch gives another control option that may be used with a device having a touch sensing surface with such force sensing feature.
- The aforementioned problems are solved, and other and further benefits achieved by a touch location and force determining method and system disclosed herein.
- According to an embodiment, a method for decoding multiple touches and forces thereof on a touch sensing surface may comprise the steps of: scanning a plurality of channels aligned on an axis for determining self capacitance values of each of the plurality of channels; comparing the self capacitance values to determine which one of the channels has a local maximum self capacitance value; scanning a plurality of nodes of the at least one channel having the local maximum self capacitance value for determining mutual values of the nodes; comparing the mutual values to determine which one of the nodes has the largest mutual capacitance value, wherein the node having the largest mutual capacitance value on the local maximum self capacitance value channel may be a potential touch location; and determining a force at the potential touch location from a change in the mutual capacitance values of the node at the potential touch location during no touch and during a touch thereto.
- According to a further embodiment, the method may comprise the steps of: determining if at least one of the self values may be greater than a self touch threshold, wherein if yes then continue to the step of scanning a plurality of nodes of the at least one channel having the largest self value, and if no then end a touch detection frame as completed. According to a further embodiment, the method may comprise the steps of: determining left and right slope values for the at least one self value, wherein: the left slope value may be equal to the at least one self value minus a self value of a channel to the left of the at least one channel, and the right slope value may be equal to the at least one self value minus a self value of a channel to the right of the at least one channel.
- According to a further embodiment, the method may comprise the steps of: determining if the left slope value may be greater than zero (0) and the right slope value may be less than zero (0), wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if the left slope value may be greater than zero (0) and greater than the right slope value, wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if the left slope value may be less than zero (0) and greater than a percentage of the right slope value, wherein if yes then return to the step of scanning the plurality of nodes of the at least one channel, and if no then continue to next step; determining if there may be another self value, wherein if yes then return to the step of determining if at least one of the self values may be greater than the self touch threshold value using the another self value, and if no then end a touch detection frame as completed.
- According to a further embodiment, the method may comprise the steps of: determining if at least one of the mutual values may be greater than a mutual touch threshold, wherein if yes then continue to the step of scanning a plurality of nodes of the at least one channel having the largest self value, and if no then end the touch detection frame as completed. According to a further embodiment, the method may comprise the steps of: determining a next slope value, wherein the next slope value may be equal to a current mutual value minus a next mutual value of a next node; and determining a previous slope value, wherein the previous slope value may be equal to the current mutual value minus a previous mutual value of a previous node.
- According to a further embodiment, the method may comprise the steps of: determining if the next slope value may be less than zero (0) and the previous slope value may be greater than zero (0), wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if the next slope value may be greater than zero (0) and less than a percentage of the previous slope value, wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if the next slope value may be less than zero (0) and greater than the previous slope value, wherein if yes then begin the step of validating the node, and if no then continue to next step; determining if there may be another mutual value, wherein if yes then return to the step of determining if at least one of the mutual values may be greater than the mutual touch threshold, and if no then continue to the next step; and determining if there may be another self value, wherein if yes then examine another self value and return to the step of determining if at least one of the self values may be greater than a self touch threshold, and if no then end the touch detection frame as completed.
- According to a further embodiment of the method, the step of validating the node may comprise the steps of: identifying the node having a local maximum mutual value as a current node; determining if there may be a valid node north of the current node, wherein if no then continue to the step of determining if there may be a valid node south of the current node, and if yes then perform a mutual measurement on the north node and continue to the next step; determining if the north node may be greater than the current node, if yes then make the north node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step; determining if there may be a valid node south of the current node, wherein if no then continue to the step of determining if there may be a valid node east of the current node, and if yes then perform a mutual measurement on the south node and continue to the next step; determining if the south node may be greater than the current node, wherein if yes then make the south node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step; determining if there may be a valid node east of the current node, wherein if no then continue to the step of determining if there may be a valid node west of the current node, and if yes then perform a mutual measurement on the east node and continue to the next step; determining if the east node may be greater than the current node, if yes then make the east node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step; determining if there may be a valid node west of the current node, wherein if no then continue to the step of determining if there may be a valid node left of the current node, and if yes then perform a mutual measurement on the west node and continue to the next step; determining if the west node may be greater than the current node, if yes then make the west node the current node and continue to the step of determining whether a touch point already exists at this node, and if no then continue to the next step; determining if there may be a valid node left of the current node, wherein if no then define a left mutual value as a center mutual value minus a right mutual value and continue to the step of determining a fine position for the node, and if yes then perform a mutual measurement on the left node and continue to the next step; determining if there may be a valid node right of the current node, wherein if no then define the right mutual value as the center mutual value minus the left mutual value and continue to the step of determining the fine position for the node, and if yes then perform a mutual measurement on the right node and continue to the next step; defining a fine position of the node by subtracting the left value from the right value, dividing this difference by the center value and multiplying the result thereof by 64 and continue to the next step; and determining whether interpolation was performed for each axis, wherein if yes, then add another touch point to a list of all detected touch points and return to the step of determining if there may be additional mutual values, and if no, then interpolate another axis by using left and right nodes of the other axis for starting again at the step of determining if there may be a valid node left of the current node.
- According to another embodiment, a system for determining gesturing motions and forces thereof on a touch sensing surface having a visual display may comprise: a first plurality of electrodes arranged in a parallel orientation having a first axis, wherein each of the first plurality of electrodes may comprise a self capacitance; a second plurality of electrodes arranged in a parallel orientation having a second axis substantially perpendicular to the first axis, the first plurality of electrodes may be located over the second plurality of electrodes and form a plurality of nodes comprising overlapping intersections of the first and second plurality of electrodes, wherein each of the plurality of nodes may comprise a mutual capacitance; a flexible electrically conductive cover over the first plurality of electrodes, wherein a face of the flexible electrically conductive cover forms the touch sensing surface; a plurality of deformable spacers between the flexible electrically conductive cover and the first plurality of electrodes, wherein the plurality of deformable spacers maintains a distance between the flexible electrically conductive cover and the first plurality of electrodes; a digital processor and memory, wherein digital outputs of the digital processor may be coupled to the first and second plurality of electrodes; an analog front end coupled to the first and second plurality of electrodes; an analog-to-digital converter (ADC) having at least one digital output coupled to the digital processor; wherein values of the self capacitances may be measured for each of the first plurality of electrodes by the analog front end, the values of the measured self capacitances may be stored in the memory; values of the mutual capacitances of the nodes of at least one of the first electrodes having at least one of the largest values of self capacitance may be measured by the analog front end, the values of the measured mutual capacitances may be stored in the memory; and the digital processor uses the stored self and mutual capacitance values for determining a gesturing motion and at least one force associated therewith applied to the touch sensing surface.
- According to a further embodiment, the digital processor, memory, analog front end and ADC may be provided by a digital device. According to a further embodiment, the digital device may comprise a microcontroller. According to a further embodiment, the flexible electrically conductive cover may comprise a flexible metal substrate. According to a further embodiment, the flexible electrically conductive cover may comprise a flexible non-metal substrate and an electrically conductive coating on a surface thereof. According to a further embodiment, the flexible electrically conductive cover may comprise a substantially light transmissive flexible substrate and a coating of Indium Tin Oxide (ITO) on a surface of the flexible substrate. According to a further embodiment, the flexible electrically conductive cover may comprise a substantially light transmissive flexible substrate and a coating of Antimony Tin Oxide (ATO) on a surface of the flexible substrate.
- According to yet another embodiment, the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of selecting an object shown in the visual display by touching the object with a first force. According to a further embodiment, the method may comprise the step of locking the object in place by touching the object with a second force. According to a further embodiment, the method may comprise the step of releasing the lock on the object by touching the object with a third force and moving the touch in a direction across the touch sensing surface. According to a further embodiment, the method may comprise the step of releasing the lock on the object by removing the touch at a first force to the object and then touching the object again at a second force. According to a further embodiment of the method, the second force may be greater than the first force.
- According to still another embodiment, a method for determining the gesturing motion and the at least one force associated therewith may comprise the steps of: touching a right portion of an object shown in the visual display with a first force; touching a left portion of the object with a second force; wherein when the first force may be greater than the second force the object rotates in a first direction, and when the second force may be greater than the first force the object rotates in a second direction.
- According to a further embodiment of the method, the first direction may be clockwise and the second direction may be counter-clockwise. According to a further embodiment of the method, when the touch at the left portion of the object moves toward the right portion of the object, the object rotates in a third direction, and when the touch at the right portion of the object moves toward the left portion of the object, the object may rotate in a fourth direction. According to a further embodiment of the method, the first and second directions may be substantially perpendicular to the third and fourth directions.
- According to a further embodiment, the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of: changing a size of an object shown in the visual display by touching a portion of the object with a force, wherein the greater the force the larger the size of the object becomes. According to a further embodiment of the method, the size of the object may be fixed when the touch and the force may be moved off of the object. According to a further embodiment of the method, the size of the object varies in proportion to the amount of force applied to the object.
- According to a further embodiment, the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of: handling pages of a document shown in the visual display by touching a portion of the document with a force sufficient to flip through the pages. According to a further embodiment of the method, the step of removing a currently visible page may further comprise the step of moving the touch at the currently visible page in a first direction parallel with the touch sensing surface. According to a further embodiment of the method, the step of inserting the removed page into a new document may comprise the step of touching the removed page with the force near the new document.
- According to a further embodiment, the method for determining the gesturing motion and the at least one force associated therewith may comprise the step of changing values of an alpha-numeric character shown in the visual display by touching the alpha-numeric character with different forces, wherein a first force will cause the alpha-numeric character to increment and a second force will cause the alpha-numeric character to decrement. According to a further embodiment of the method, the value of the alpha-numeric character may be locked when the touch may be moved off of the alpha-numeric character and parallel to the touch sensing surface.
- According to a further embodiment, the method for determining the gesturing motion and the at least one force associated therewith may comprise the steps of: incrementing a value of an alpha-numeric character shown in the visual display by touching an upper portion of the alpha-numeric character with a force; and decrementing the value of the alpha-numeric character by touching a lower portion of the alpha-numeric character with the force. According to a further embodiment of the method, the value of the alpha-numeric character may be locked when the touch may be moved off of the alpha-numeric character and parallel to the touch sensing surface. According to a further embodiment of the method, a speed of incrementing or decrementing the value of the alpha-numeric character may be proportional to a magnitude of the force applied to the upper portion or lower portion, respectively, of the alpha-numeric character. According to a further embodiment of the method, the alpha-numeric character may be a number. According to a further embodiment of the method, the alpha-numeric character may be a letter of an alphabet.
- A more complete understanding of the present disclosure thereof may be acquired by referring to the following description taken in conjunction with the accompanying drawings wherein:
- FIG. 1 illustrates a schematic block diagram of an electronic system having a capacitive touch sensor, a capacitive touch analog front end and a digital processor, according to the teachings of this disclosure;
- FIG. 2 illustrates schematic elevational views of metal over capacitive touch sensors, according to the teachings of this disclosure;
- FIG. 3 illustrates a schematic elevational view of a touch sensor capable of detecting both locations of touches thereto and forces of those touches, according to the teachings of this disclosure;
- FIGS. 4A to 4D illustrate schematic plan views of touch sensors having various capacitive touch sensor configurations, according to the teachings of this disclosure;
- FIGS. 4E and 4F illustrate schematic plan views of self and mutual capacitive touch detection of a single touch to a touch sensor, according to the teachings of this disclosure;
- FIGS. 4G to 4K illustrate schematic plan views of self and mutual capacitive touch detection of two touches to a touch sensor, according to the teachings of this disclosure;
- FIG. 5 illustrates a schematic process flow diagram for multi-touch and force decoding of a touch sensor as shown in FIG. 1, according to specific example embodiments of this disclosure;
- FIG. 6 illustrates a graph of single touch peak detection data, according to specific example embodiments of this disclosure;
- FIG. 7 illustrates a schematic plan diagram of potential touch and mutual touch locations of a touch sensor, according to specific example embodiments of this disclosure;
- FIG. 8 illustrates a schematic plan view diagram of a touch sensor showing a cache data window thereof, according to specific example embodiments of this disclosure;
- FIG. 9 illustrates a graph of self scan values and a table of mutual scan values for two touch peak detection data, according to specific example embodiments of this disclosure;
- FIGS. 10 and 11 illustrate schematic diagrams of historic and current point locations used for a point weighting example, according to the teachings of this disclosure;
- FIG. 12 illustrates schematic drawings of a normal finger touch and a flat finger touch, according to the teachings of this disclosure;
- FIGS. 13 to 23 illustrate schematic process flow diagrams for touch decoding and force determination of the decoded touch(es), according to specific example embodiments of this disclosure;
- FIG. 24 illustrates a schematic plan view of a finger of a hand touching a surface of a touch sensor, according to a specific example embodiment of this disclosure;
- FIG. 25 illustrates a schematic plan view of two fingers of a hand touching a surface of a touch sensor, according to another specific example embodiment of this disclosure;
- FIG. 26 illustrates a schematic plan view of a finger of a hand touching an object projected on a surface of a touch sensor, according to yet another specific example embodiment of this disclosure;
- FIG. 27 illustrates a schematic plan view of a finger of a hand touching a document projected on a surface of a touch sensor, according to still another specific example embodiment of this disclosure; and
- FIG. 28 illustrates a schematic plan view of a finger of a hand touching one digit of a number projected on a surface of a touch sensor, according to another specific example embodiment of this disclosure.
- While the present disclosure is susceptible to various modifications and alternative forms, specific example embodiments thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the description herein of specific example embodiments is not intended to limit the disclosure to the particular forms disclosed herein, but on the contrary, this disclosure is to cover all modifications and equivalents as defined by the appended claims.
- According to various embodiments, a series of optimized processes may be provided that scan a plurality of (electrically) conductive columns and rows arranged in a matrix on a surface, e.g., touch sensor display or panel, and which identify and track a plurality of touches thereto and forces thereof. These processes may be further optimized for operation with a low cost 8-bit microcontroller, according to specific embodiments of this disclosure.
- Once a touch has been established, a force thereof may be assigned to the touch based upon the magnitude of change of the capacitance values determined during scans of a touch sensor, as more fully described hereinabove. Also the touch forces applied to the touch sensor from the associated tracked touch points may be utilized in further determining three dimensional gesturing, e.g., X, Y and Z positions and forces, respectively. For example, proportional force at a touch location(s) allows three dimensional control of an object projected onto a screen of the touch sensor. Differing pressures on multiple points, e.g., during more than one touch (multiple fingers touching the face of the touch sensor), allow object rotation control. A touch at a certain force may allow selecting an object(s) and a touch at a different, e.g., greater, force may be used to fix the location(s) of the object(s) on the display of the touch sensor.
- Rocking multi-touch presses to produce varying touch forces may be used for rotation of an object. A vertical motion, e.g., vertical sliding, press may be used to scale a vertical size of an object. A horizontal motion, e.g., horizontal sliding, press may be used to scale a horizontal size of an object. Touches with varying force may be used to flip through pages of a document. A varying force may be used to insert a page into a stack of pages of a document. A vertical or horizontal gesture and force may be used to activate a function, e.g., empty trash bin icon. Varying touch pressure may be used to lift a page off of a document for transmission to another display. Varying touch pressure may change the scope of a gesture movement, e.g., selecting a picture instead of the full document. Pressing with a sweeping gesture may be used for an object release and discard. Varying touch pressures may be used to select alpha-numeric characters or drop function boxes.
- According to various embodiments, these processes utilize both self and mutual scans to perform an optimized scan of the plurality of conductive columns and rows used for touch sensing. Using that as the basis, the proposed processes may use a subset of the data from the plurality of conductive columns and rows in order to do all necessary processing for touch location identification and tracking. The various embodiments specifically focus on a low-resource requirement solution for achieving touch location identification and tracking.
- According to various embodiments, self capacitances of either the conductive columns or rows may be measured first then mutual capacitances of only those conductive columns or rows may be measured in combination with the other axis of conductive rows or columns. The various embodiments disclosed herein overcome the problem of transforming these self and mutual capacitance measurements into one or more touches and forces thereof, and tracking these one or more touches and forces thereof through multiple frames of the capacitance measurements of the conductive columns or rows as described hereinabove.
- According to various embodiments, at least one process may scan a plurality of conductive columns and rows arranged in a matrix, detect and track up to N touches, using various unique techniques disclosed and claimed herein. A process of peak detection examines slope ratios to accurately and quickly determine peak measurements. According to various embodiments, the challenge of tracking multiple touch locations may be solved through time on associated ones of the plurality of conductive columns or rows.
- The various embodiments may allow for N touches to compensate for touches of different finger positions, e.g., such as a flat finger, that prevents missed touches and substantially eliminates incorrect touches.
- According to various embodiments, a process is provided for quickly identifying accurate touches instead of only looking at true peaks, wherein a “virtual” peak may be found by examining slope ratios using various techniques disclosed herein for touch identification. A combination of unique processes, according to the teachings of this disclosure, may be used to achieve better accuracy and speed improvements for multi-touch decoding. For example, a peak detection process may be implemented as a “fuzzy” peak detection process that examines slope relationships, not just signs of the slopes between the conductive columns measured. Furthermore, a so-called “nudge technique” may be used that “nudges” a potential touch location to a best location by examining adjacent values thereto. “Windowed” data cache may be used to accelerate processing in a low capacity RAM environment, e.g., 8-bit microcontroller. Interpolation may be used to increase the touch location resolution based upon measured values adjacent thereto. Multi-touch tracking may be used to identify N touches through time. Multi-touch tracking may be used to track N touches through time. Weighted matching may be used in a weighting method to best match touch points over time. “Area” detection may use a process that allows easy area and/or pressure detection based upon the sum of the nudged values for a given touch location.
- Significant accuracy and speed of decoding improvements may use a combination of novel techniques for use in a low memory capacity and low cost digital processor, e.g., microcontroller, microprocessor, digital signal processor (DSP), application specific integrated circuit (ASIC), programmable logic array (PLA), etc. Various embodiments may track eight or more touches and forces thereof on, for example but not limited to, a 3.5 inch capacitive touch sensor array, for example when using a Microchip PIC18F46K22 (64K ROM, <4K RAM) microcontroller.
- Referring now to the drawings, the details of example embodiments are schematically illustrated. Like elements in the drawings will be represented by like numbers, and similar elements will be represented by like numbers with a different lower case letter suffix.
- Referring to
FIG. 1 , depicted is a schematic block diagram of an electronic system having a capacitive touch sensor, a capacitive touch analog front end and a digital processor, according to the teachings of this disclosure. Adigital device 112 may comprise a digital processor andmemory 106, an analog-to-digital converter (ADC)controller 108, and a capacitive touch analog front end (AFE) 110. Thedigital device 112 may be coupled to atouch sensor 102 comprised of a plurality ofconductive columns 104 androws 105 arranged in a matrix and having a flexible electricallyconductive cover 103 thereover. It is contemplated and within the scope of this disclosure that theconductive rows 105 and/orconductive columns 104 may be, for example but are not limited to, printed circuit board conductors, wires, Indium Tin Oxide (ITO) or Antimony Tin Oxide (ATO) coatings on a clear substrate, e.g., display/touch screen, etc., or any combinations thereof. The flexible electricallyconductive cover 103 may comprise metal, conductive non-metallic material, ITO or ATO coating on a flexible clear substrate (plastic), etc. Thedigital device 112 may comprise a microcontroller, microprocessor, digital signal processor, application specific integrated circuit (ASIC), programmable logic array (PLA), etc., and may further comprise one or more integrated circuits (not shown), packaged or unpackaged. - Referring to
FIG. 2 , depicted are schematic elevational views of metal over capacitive touch sensors, according to the teachings of this disclosure. A capacitive sensor 238 is on a substrate 232. On either side of the capacitive sensor 238 are spacers 234, and an electrically conductive flexible cover 103, e.g., metal, ITO or ATO coated plastic, etc., is located on top of the spacers 234 and forms a chamber 236 over the capacitive sensor 238. When a force 242 is applied to a location on the flexible cover 103, the flexible cover 103 moves toward the capacitive sensor 238, thereby increasing the capacitance thereof. The capacitance value(s) of the capacitive sensor(s) 238 is measured and an increase in the capacitance value thereof will indicate the location of the force 242 (e.g., touch). The capacitance value of the capacitive sensor 238 will increase the closer the flexible cover 103 moves toward the face of the capacitive sensor 238. Metal over capacitive touch technology is more fully described in Application Note AN1325, entitled “mTouch™ Metal over Cap Technology” by Keith Curtis and Dieter Peter, available at www.microchip.com, and is hereby incorporated by reference herein for all purposes. - Referring to
FIG. 3 , depicted is a schematic elevational view of a touch sensor capable of detecting both locations of touches thereto and forces of those touches, according to the teachings of this disclosure. A touch sensor capable of detecting both a location of a touch(es) thereto and a force(s) of that touch(es) thereto, generally represented by the numeral 102, may comprise a plurality of conductive rows 105, a plurality of conductive columns 104, a plurality of deformable spacers 334, and a flexible electrically conductive cover 103. - The
conductive columns 104 and the conductive rows 105 may be used in determining a location(s) of a touch(es), more fully described in Technical Bulletin TB3064, entitled “mTouch™ Projected Capacitive Touch Screen Sensing Theory of Operation” referenced hereinabove, and the magnitude of changes in the capacitance values of the conductive column(s) 104 at and around the touch location(s) may be used in determining the force 242 (amount of pressure applied at the touch location). The plurality of deformable spacers 334 may be used to maintain a constant spacing between the flexible conductive cover 103 and a front surface of the conductive columns 104 when no force 242 is being applied to the flexible electrically conductive cover 103. When force 242 is applied to a location on the flexible electrically conductive cover 103, the flexible electrically conductive cover 103 will be biased toward at least one conductive column 104, thereby increasing the capacitance thereof. Direct measurements of capacitance values and/or ratios of the capacitance values may be used in determining the magnitude of the force 242 being applied at the touch location(s). - Referring back to
FIG. 1 , digital devices 112, e.g., microcontrollers, now include peripherals that enhance the detection and evaluation of such capacitive value changes. More detailed descriptions of various capacitive touch system applications are more fully disclosed in Microchip Technology Incorporated application notes AN1298, AN1325 and AN1334, available at www.microchip.com, and all are hereby incorporated by reference herein for all purposes. - One such application utilizes the capacitive voltage divider (CVD) method to determine a capacitance value and/or evaluate whether the capacitive value has changed. The CVD method is more fully described in Application Note AN1208, available at www.microchip.com; and a more detailed explanation of the CVD method is presented in commonly owned United States Patent Application Publication No. US 2010/0181180, entitled “Capacitive Touch Sensing using an Internal Capacitor of an Analog-To-Digital Converter (ADC) and a Voltage Reference,” by Dieter Peter; wherein both are hereby incorporated by reference herein for all purposes.
- A Charge Time Measurement Unit (CTMU) may be used for very accurate capacitance measurements. The CTMU is more fully described in Microchip application notes AN1250 and AN1375, available at www.microchip.com, and commonly owned U.S. Pat. No. 7,460,441 B2, entitled “Measuring a long time period;” and U.S. Pat. No. 7,764,213 B2, entitled “Current-time digital-to-analog converter,” both by James E. Bartling; wherein all of which are hereby incorporated by reference herein for all purposes.
- It is contemplated and within the scope of this disclosure that any type of capacitance measurement circuit having the necessary resolution may be used in determining the capacitance values of the plurality of
conductive columns 104 and nodes (intersections of columns 104 and rows 105), and that a person having ordinary skill in the art of electronics and having the benefit of this disclosure could implement such a capacitance measurement circuit. - Referring to
FIGS. 4A to 4D , depicted are schematic plan views of touch sensors having various capacitive touch sensor configurations, according to the teachings of this disclosure. FIG. 4A shows conductive columns 104 and conductive rows 105. Each of the conductive columns 104 has a “self capacitance” that may be individually measured when in a quiescent state, or all of the conductive rows 105 may be actively excited while each one of the conductive columns 104 has self capacitance measurements made thereof. Active excitation of all of the conductive rows 105 may provide a stronger measurement signal for individual capacitive measurements of the conductive columns 104. - For example, if there is a touch detected on one of the
conductive columns 104 during a self capacitance scan, then only that conductive column 104 having the touch detected thereon need be measured further during a mutual capacitance scan thereof. The self capacitance scan may only determine which one of the conductive columns 104 has been touched, but not at what location along the axis of that conductive column 104 where it was touched. The mutual capacitance scan may determine the touch location along the axis of that conductive column 104 by individually exciting (driving) one at a time the conductive rows 105 and measuring a mutual capacitance value for each one of the locations on that conductive column 104 that intersects (crosses over) the conductive rows 105. There may be an insulating non-conductive dielectric (not shown) between and separating the conductive columns 104 and the conductive rows 105. Where the conductive columns 104 intersect with (cross over) the conductive rows 105, mutual capacitors 120 are thereby formed. During the self capacitance scan above, all of the conductive rows 105 may be either grounded or driven with a logic signal, thereby forming individual column capacitors associated with each one of the conductive columns 104. -
FIGS. 4B and 4C show interleaving of diamond shaped patterns of the conductive columns 104 and the conductive rows 105. This configuration may maximize exposure of each axis conductive column and/or row to a touch (e.g., better sensitivity) with a smaller overlap between the conductive columns 104 and the conductive rows 105. FIG. 4D shows receiver (top) conductive rows (e.g., electrodes) 105 a and transmitter (bottom) conductive columns 104 a comprising comb-like meshing fingers. The conductive columns 104 a and conductive rows 105 a are shown in a side-by-side plan view, but normally the top conductive rows 105 a would be over the bottom conductive columns 104 a. Self and mutual capacitive touch detection is more fully described in Technical Bulletin TB3064, entitled “mTouch™ Projected Capacitive Touch Screen Sensing Theory of Operation” by Todd O'Connor, available at www.microchip.com; and commonly owned United States Patent Application Publication No. US 2012/0113047, entitled “Capacitive Touch System Using Both Self and Mutual Capacitance” by Jerry Hanauer; wherein both are hereby incorporated by reference herein for all purposes. - Referring to
FIGS. 4E and 4F , depicted are schematic plan views of self and mutual capacitive touch detection of a single touch to a touch sensor, according to the teachings of this disclosure. In FIG. 4E a touch, represented by a picture of a part of a finger, is at approximately the coordinates of X05, Y07. During self capacitive touch detection each one of the rows Y01 to Y09 may be measured to determine the capacitance values thereof. Note that baseline capacitance values with no touches thereto for each one of the rows Y01 to Y09 have been taken and stored in a memory (e.g., memory 106—FIG. 1 ). Any significant capacitance change to the baseline capacitance values of the rows Y01 to Y09 will be obvious and taken as a finger touch. In the example shown in FIG. 4E the finger is touching row Y07 and the capacitance value of that row will change, indicating a touch thereto. However, it is still unknown from the self capacitance measurements where on this row the touch has occurred. - Once the touched row (Y07) has been determined using the self capacitance change thereof, mutual capacitive detection may be used in determining where on the touched row (Y07) the touch has occurred. This may be accomplished by exciting, e.g., putting a voltage pulse on, each of the columns X01 to X12 one at a time while measuring the capacitance value of row Y07 when each of the columns X01 to X12 is individually excited. The column (X05) excitation that causes the largest change in the capacitance value of row Y07 will be the location on that row which corresponds to the intersection of column X05 with row Y07; thus the single touch is at point or node X05, Y07. Using self and mutual capacitance touch detection significantly reduces the number of row and column scans needed to obtain the X,Y touch coordinate on the
touch sensor 102. In this example, nine (9) rows were scanned during self capacitive touch detection and twelve (12) columns were scanned during mutual capacitive touch detection for a total of 9+12=21 scans. If individual x-y capacitive touch sensors for each node (location) were used, then 9×12=108 scans would be necessary to find this one touch, a significant difference. It is contemplated and within the scope of this disclosure that the self capacitances of the columns X01 to X12 may be determined first and then mutual capacitances determined for a selected column(s) by exciting each row Y01 to Y09 to find the touch location on the selected column(s).
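- As a purely illustrative sketch (not the claimed implementation), the scan-count saving described above can be expressed in C as a self scan of all rows followed by a mutual scan of only the row flagged as touched. The helper routines measure_self_capacitance( ), measure_mutual_capacitance( ) and the row_baseline[ ] table are assumed stand-ins for the AFE/ADC access of FIG. 1 and are not part of this disclosure:

    #include <stdint.h>

    #define NUM_ROWS        9
    #define NUM_COLS        12
    #define TOUCH_THRESHOLD 12   /* counts above baseline treated as a touch (assumed) */

    /* Hypothetical AFE/ADC access routines and calibration table (assumptions). */
    extern uint16_t measure_self_capacitance(uint8_t row);
    extern uint16_t measure_mutual_capacitance(uint8_t row, uint8_t col);
    extern uint16_t row_baseline[NUM_ROWS];

    /* Self scan all rows, then mutual scan only the touched row:
     * 9 + 12 = 21 measurements for a single touch instead of 9 x 12 = 108. */
    void locate_single_touch(uint8_t *touch_row, uint8_t *touch_col)
    {
        for (uint8_t r = 0; r < NUM_ROWS; r++) {
            if (measure_self_capacitance(r) <= row_baseline[r] + TOUCH_THRESHOLD)
                continue;                     /* no touch detected on this row */
            uint16_t best = 0;
            for (uint8_t c = 0; c < NUM_COLS; c++) {
                uint16_t mutual = measure_mutual_capacitance(r, c);
                if (mutual > best) {          /* largest mutual change marks the node */
                    best = mutual;
                    *touch_row = r;
                    *touch_col = c;
                }
            }
            return;                           /* single-touch example: stop after one row */
        }
    }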
- Referring to
FIGS. 4G to 4K , depicted are schematic plan views of self and mutual capacitive touch detection of two touches to a touch sensor, according to the teachings of this disclosure. In FIG. 4G two touches, represented by a picture of parts of two fingers, are at approximately the coordinates of X05, Y07 for touch # 1 and X02, Y03 for touch # 2. During self capacitive touch detection each one of the rows Y01 to Y09 may be measured to determine the capacitance values thereof. Note that baseline capacitance values with no touches thereto for each one of the rows Y01 to Y09 have been taken and stored in a memory (e.g., memory 106—FIG. 1 ). Any significant capacitance changes to the baseline capacitance values of the rows Y01 to Y09 will be obvious and taken as finger touches. In the example shown in FIG. 4H the first finger is touching row Y07 and the second finger is touching row Y03, wherein the capacitance values of those two rows will change, indicating touches thereto. However, it is still unknown from the self capacitance measurements where on these two rows the touches have occurred. - Once the touched rows (Y07 and Y03) have been determined using the self capacitance changes thereof, mutual capacitive detection may be used in determining where on these two touched rows (Y07 and Y03) the touches have occurred. Referring to
FIG. 4I , this may be accomplished by exciting, e.g., putting a voltage pulse on, each of the columns X01 to X12 one at a time while measuring the capacitance value of row Y07 when each of the columns X01 to X12 is individually excited. The column (X05) excitation that causes the largest change in the capacitance value of row Y07 will be the location on that row that corresponds to the intersection of column X05 with row Y07. Referring to FIG. 4J , likewise measuring the capacitance value of row Y03 when each of the columns X01 to X12 is individually excited determines where on row Y03 the touch # 2 has occurred. Referring to FIG. 4K , the two touches are at points or nodes (X05, Y07) and (X02, Y03). It is contemplated and within the scope of this disclosure that if the capacitances of more than one of the selected rows, e.g., Y07 and Y03, can be measured simultaneously, then only one set of individual column X01 to X12 excitations is needed in determining the two touches to the touch sensor 102. - Referring to
FIG. 5 , depicted is a schematic process flow diagram for multi-touch and force decoding of a touch sensor as shown in FIG. 1 , according to specific example embodiments of this disclosure. A process of multi-touch decoding may comprise the steps of Data Acquisition 502, Touch Identification 504, Force Identification 505, Touch and Force Tracking 506, and Data Output 508. The step of Touch Identification 504 may further comprise the steps of Peak Detection 510, Nudge 512 and Interpolation 514, more fully described hereinafter. - Data Acquisition.
-
Data Acquisition 502 is the process of taking self capacitance measurements of the plurality of conductive columns 104 or conductive rows 105, and then mutual capacitance measurements of selected ones of the plurality of conductive columns 104 or conductive rows 105, and intersections of the plurality of conductive rows 105 or conductive columns 104, respectively therewith, to acquire touch identification data. The touch identification data may be further processed to locate potential touches and forces thereto on the touch sensor 102 using the processes of Touch Identification 504 and Force Identification 505, respectively, as more fully described hereinafter. - Touch Identification
-
Touch Identification 504 is the process of using the touch identification data acquired during the process of Data Acquisition 502 to locate potential touches on the touch sensor 102. The following is a sequence of process steps to determine which ones of the plurality of conductive columns 104 or conductive rows 105 to select as having a touch(es) thereto using self capacitance measurements thereof, and where on the selected conductive columns 104 or conductive rows 105 the touch(es) may have occurred using mutual capacitance measurements thereof. - Peak Detection
-
Peak detection 510 is the process of identifying where potential touch locations may be on the touch sensor 102. However, according to the teachings of this disclosure, instead of only looking at actual detected “peaks,” peak detection may purposely be made “fuzzy,” e.g., identifying potential peaks by looking for ratios of differences of slope values as well as slope “signs,” not just a low-high-low value sequence. A “virtual” peak may be detected by examining slope ratios, e.g., a 2:1 slope ratio, wherein a change in slope may be identified as a potential peak. This may be repeated until no additional peaks are detected.
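- The following minimal C sketch illustrates one way such “fuzzy” peak detection could be coded; the value threshold, the 2:1 ratio and the handling of negative slopes are assumptions for illustration only and are not asserted to be the claimed process:

    #include <stdint.h>
    #include <stdbool.h>

    #define VALUE_THRESHOLD 12   /* ignore values below this (assumed) */

    /* Flag index i as a potential peak when the slope changes sign (true peak)
     * or changes by more than a 2:1 ratio ("fuzzy"/virtual peak). */
    bool is_potential_peak(const int16_t *v, uint8_t n, uint8_t i)
    {
        if (i == 0 || i + 1 >= n || v[i] < VALUE_THRESHOLD)
            return false;
        int16_t left  = v[i] - v[i - 1];      /* slope into node i   */
        int16_t right = v[i + 1] - v[i];      /* slope out of node i */
        if (left > 0 && right < 0)
            return true;                      /* low-high-low: a true peak       */
        if (left > 0 && right > 0 && left > 2 * right)
            return true;                      /* rising slope flattens by > 2:1  */
        if (left < 0 && right < 0 && left < 2 * right)
            return true;                      /* falling slope flattens by > 2:1 */
        return false;
    }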
- Nudge
- Nudge 512 is the process of examining each location adjacent to a potential touch location once it has been identified. If an adjacent location has a greater value than the existing potential touch location, then eliminate the current potential touch location and identify the adjacent location having the greater value as the potential touch location (see
FIG. 8 and the description thereof hereinafter).
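- A rough sketch of the nudge step, assuming a node_value( ) accessor over the acquired touch data (an assumption for illustration, not the disclosed implementation), is:

    #include <stdint.h>
    #include <stdbool.h>

    #define NUM_ROWS 9
    #define NUM_COLS 12

    extern int16_t node_value(uint8_t row, uint8_t col);   /* assumed data accessor */

    /* Repeatedly move the candidate to whichever adjacent node holds a larger
     * value until the candidate is a local peak. */
    void nudge(uint8_t *row, uint8_t *col)
    {
        bool moved;
        do {
            moved = false;
            int16_t best = node_value(*row, *col);
            const int8_t dr[4] = { -1, 1, 0, 0 };   /* north, south, west, east */
            const int8_t dc[4] = { 0, 0, -1, 1 };
            for (uint8_t i = 0; i < 4; i++) {
                int16_t r = *row + dr[i];
                int16_t c = *col + dc[i];
                if (r < 0 || r >= NUM_ROWS || c < 0 || c >= NUM_COLS)
                    continue;                       /* skip nodes off the sensor */
                if (node_value((uint8_t)r, (uint8_t)c) > best) {
                    best  = node_value((uint8_t)r, (uint8_t)c);
                    *row  = (uint8_t)r;
                    *col  = (uint8_t)c;
                    moved = true;
                }
            }
        } while (moved);
    }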
- Interpolation
- Once a touch location has been identified,
Interpolation 514 is the process that examines the adjacent values to generate a higher resolution location. - Force Identification
-
Force Identification 505 is the process of using some of the touch identification data acquired during the process of Data Acquisition 502 in combination with the potential touch locations identified during the process of Touch Identification 504. The mutual capacitance measurements associated with the potential touch locations, determined during the process of Touch Identification 504, may be compared with reference capacitance values of those same locations with no touches applied thereto (smaller capacitance values). The magnitude of a capacitance change may thereby be used in determining the force applied by the associated potential touch previously determined. - Touch and Force Tracking
- Touch and
Force Tracking 506 is the process of comparing time sequential “frames” of touch identification data and then determining which touches are associated between sequential frames. A combination of weighting and “best guess” matching may be used to track touches and forces thereof through multiple frames during the process of Data Acquisition 502 described hereinabove. This is repeated for every peak detected and every touch that was identified on the previous frame. A “frame” is the full set of self and mutual capacitance scans of the plurality of conductive columns 104 or conductive rows 105 used to acquire touch identification data of the touch sensor 102, capturing a single set of touches at the specific time associated with that frame. - Touch and
Force Tracking 506 associates a given touch in one frame with a given touch in a subsequent frame. Touch and Force tracking may create a history of touch frames, and may associate the touch locations of a current frame with the touch locations of a previous frame or frames. In order to associate a previous touch location to a current potential touch location a “weighting” function may be used. The weight values (“weight” and “weight values” will be used interchangeably herein) between time sequential touch locations (of different frames) represent the likelihood that time sequential touch locations (of different frames) are associated with each other. Distance calculations may be used to assign weight values between these associated touch locations. A “true” but complex and processor intensive calculation for determining weight value between touch locations is: -
Weight value = SQRT[(X_previous − X_current)² + (Y_previous − Y_current)²]   Eq. (1) - A simplified distance (weight value) calculation may be used that measures ΔX and ΔY and then sums them together:
-
Weight value′ = ABS(X_previous − X_current) + ABS(Y_previous − Y_current)   Eq. (2) - The above simplified weight value calculation, Eq. (2), creates a diamond shaped pattern for a given weight value instead of the circular pattern of the more complex weight value calculation, Eq. (1). Use of Eq. (2) may be optimized for speed of the weight value calculations in a simple processing system, wherein distance is calculated based upon the sum of the change in the X-distances and the change in the Y-distances, e.g., Eq. (2) hereinabove. A better weight value may be defined as a smaller distance between sequential touch locations.
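- By way of a non-limiting illustration, Eq. (1) and Eq. (2) may be coded as follows; on a small 8-bit processor the simplified form avoids the multiplies and the square root:

    #include <stdint.h>
    #include <stdlib.h>   /* abs()   */
    #include <math.h>     /* sqrtf() */

    /* Eq. (1): "true" Euclidean distance between a previous and a current location. */
    float weight_true(int16_t x_prev, int16_t y_prev, int16_t x_cur, int16_t y_cur)
    {
        int32_t dx = x_prev - x_cur;
        int32_t dy = y_prev - y_cur;
        return sqrtf((float)(dx * dx + dy * dy));
    }

    /* Eq. (2): simplified |dX| + |dY| weight; a smaller value is a better match. */
    uint16_t weight_simple(int16_t x_prev, int16_t y_prev, int16_t x_cur, int16_t y_cur)
    {
        return (uint16_t)(abs(x_prev - x_cur) + abs(y_prev - y_cur));
    }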
- For each new touch location a weight value may be calculated for all touch locations from the previous frame. The new touch location is then associated with the previous touch location having the best weight value therebetween. If the previous touch location already has an associated touch location from a previous frame, the second-best weight value for each touch location may be examined. The touch location with the lower-cost second-best weight value may then be shifted to its second-best location, and the other touch location may be kept as the best touch location. This process is repeated until all touch locations have been associated with previous frame touch locations, or have been identified as “new touches” having new locations with no touch locations from the previous frame being close to the new touch location(s).
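- One possible (illustrative, non-limiting) coding of this association step finds, for a new touch location, the best and second-best previous locations so that the difference between them can serve as the shifting “cost” described above; the point_t type and the weight_simple( ) helper are assumptions carried over from the sketch above:

    #include <stdint.h>

    typedef struct { int16_t x, y; } point_t;

    extern uint16_t weight_simple(int16_t x_prev, int16_t y_prev,
                                  int16_t x_cur,  int16_t y_cur);   /* Eq. (2) sketch above */

    /* For one new touch, find the best and second-best previous-frame touches.
     * The difference (second_w - best_w) is the "cost" of shifting this new
     * touch to its second choice when two new touches compete for the same
     * previous touch. */
    void rank_matches(point_t new_touch, const point_t *prev, uint8_t n_prev,
                      uint8_t *best_idx, uint16_t *best_w, uint16_t *second_w)
    {
        *best_w   = 0xFFFF;
        *second_w = 0xFFFF;
        *best_idx = 0;
        for (uint8_t i = 0; i < n_prev; i++) {
            uint16_t w = weight_simple(prev[i].x, prev[i].y, new_touch.x, new_touch.y);
            if (w < *best_w) {
                *second_w = *best_w;
                *best_w   = w;
                *best_idx = i;
            } else if (w < *second_w) {
                *second_w = w;
            }
        }
    }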
- An alternative to the aforementioned weighting process may be a vector-based process utilizing a vector created from the previous two locations to create the most likely next location. This vector-based weighting process may use the same distance calculations as the aforementioned weighting process, running it from multiple points and modifying the weight values based upon from which point the measurement was taken.
- By looking at the previous two locations of a touch, the next “most likely” location of that touch may be predicted. Once the extrapolated location has been determined that location may be used as the basis for a weighting value. To improve matching on the extrapolated location an “acceleration model” may be used to add weighting points along the vector to the extrapolated locations and past the extrapolated locations. These additional points assist in detecting changes in speed of the touch movement, but may not be ideal for determining direction of the touch motion.
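- A minimal sketch of such a prediction, assuming simple linear extrapolation from the two most recent locations (the point_t structure is the illustrative type used in the sketches above), is:

    #include <stdint.h>

    typedef struct { int16_t x, y; } point_t;

    /* Extend the vector between the two most recent locations of a tracked
     * touch to predict where that touch is most likely to appear next. */
    point_t predict_next(point_t previous, point_t current)
    {
        point_t predicted;
        predicted.x = (int16_t)(2 * current.x - previous.x);
        predicted.y = (int16_t)(2 * current.y - previous.y);
        return predicted;
    }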
- Once the touch locations have been established, forces thereto may be assigned to these touch locations based upon the magnitude of change of the capacitance values determined during the process of
Data Acquisition 502, as more fully described hereinabove. Also, the forces applied to the touch sensor 102 from the associated tracked touch points may be utilized in further determining three dimensional gesturing, e.g., X-Y and Z directions.
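- As an illustrative sketch only, converting the capacitance change at a touch node into a force value may be as simple as subtracting the stored “no touch” reference and applying a scaling factor; the mutual_reference[ ][ ] table and FORCE_SCALE constant are assumptions, and a real design may use a calibrated, non-linear conversion instead:

    #include <stdint.h>

    #define NUM_ROWS    9
    #define NUM_COLS    12
    #define FORCE_SCALE 4   /* assumed linear counts-to-force scaling */

    /* Stored "no touch" mutual capacitance values from a calibration scan (assumed). */
    extern uint16_t mutual_reference[NUM_ROWS][NUM_COLS];

    /* A larger increase over the untouched reference implies a larger applied force. */
    uint16_t force_at_node(uint8_t row, uint8_t col, uint16_t measured_mutual)
    {
        uint16_t ref   = mutual_reference[row][col];
        uint16_t delta = (measured_mutual > ref) ? (uint16_t)(measured_mutual - ref) : 0;
        return delta / FORCE_SCALE;
    }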
- Referring to
FIGS. 10 and 11 , depicted are schematic diagrams of historic and current point locations used for a point weighting example, according to the teachings of this disclosure. Once weights have been generated, the best combination of weight values and associated touches may be generated. Certain touch scenarios may cause nearly identical weight values, in which case the second best weight values should be compared and associations appropriately shifted. Depending upon the order of operations, points A and D may be associated first. As the weight values for B are generated, BD is a better match than BC. In this case, look at the secondary weight values: is it less costly to shift A to be associated with C or to shift B to be associated with C? - By extending this sequence of operations, all points can have associations shifted for the best overall match, not just the best local match. Some caution may be needed to prevent infinite loops of re-weighting. This may be accomplished by limiting the number of shifts to a finite number. Referring now to
FIG. 11 , points A and B are existing points, and points 1 and 2 are “new” points that need to be associated. - Step 1) Calculate weight values between touch locations:
-
- A1 weight=5 ((ΔX=2)+(ΔY=3)=5)
- A2 weight=4
- B1 weight=10
- B2 weight=5
Step 2) Select the “best” pair (lowest weight) for each existing touch location: - A>2 weight=4 and B>2 weight=5
Step 3) If more than one existing touch location pairs with a given new touch location, then look at the second-best touch locations for each and the difference in weight values from the best to the second best pair (the “cost”). - A1 (weight: 5) Cost=1: (A1 weight)−(A2 weight 4)
- B1 (weight: 10) Cost=5: (B1 weight)−(B2 weight 5)
Step 4) Shift the pairing to the lowest cost pair thereby allowing the other touch location to maintain the original pairing. - A1
- B2
Step 5) Repeat steps 2) through 4) until all pairings are 1:1. If there are more new touch locations than existing touch locations, then start tracking a new touch location. If there are fewer new touch locations than existing “worst match” touch locations, then these worst match touch locations may be lost and no longer tracked. A minimal numeric sketch of this pairing logic is given below.
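- The following small C program reproduces the example above (A1=5, A2=4, B1=10, B2=5) purely for illustration; it is not the claimed implementation:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Weights from the example: rows are previous points A and B,
         * columns are new points 1 and 2.                              */
        const uint8_t w[2][2] = { { 5, 4 },      /* A1, A2 */
                                  { 10, 5 } };   /* B1, B2 */

        /* Both A and B prefer new point 2 (A2 = 4, B2 = 5), so compare the
         * cost of shifting each one to its second-best pairing (point 1).  */
        uint8_t cost_a = w[0][0] - w[0][1];      /* A1 - A2 = 1 */
        uint8_t cost_b = w[1][0] - w[1][1];      /* B1 - B2 = 5 */

        if (cost_a < cost_b)
            printf("Shift A to point 1 (cost %u); B keeps point 2 -> pairs A1, B2\n", cost_a);
        else
            printf("Shift B to point 1 (cost %u); A keeps point 2 -> pairs B1, A2\n", cost_b);
        return 0;
    }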
- Flat Finger Identification
- Referring to
FIG. 12 , depicted are schematic drawings of a normal finger touch and a flat finger touch, according to the teachings of this disclosure. One challenge of identifying a touch is the “flat finger” scenario. This is when the side or flat part of a finger 1020, rather than the finger tip 1022, is placed on the touch sensor 102. Note that a flat finger 1020 may generate two or more potential touch locations. A flat finger 1020 may be identified by accumulating the sum of the values of all nodes nudged to each peak. If the sum of these values surpasses a threshold then it is likely caused by a flat finger touch. If a flat finger touch is detected then other touches that are near the flat finger peak(s) may be suppressed. In addition, comparing the forces associated with the two or more potential touch locations may help identify a flat finger 1020 situation.
- Referring back to
FIG. 5 ,Data Output 508 is the process of providing determined touch location coordinates and associated forces applied thereto in a data packet(s) to a host system for further processing. - Touch Determination
- Given an array of touch data, examine the differences between the values thereof and flag certain key scenarios as potential peaks for further examination. All touch data values below a threshold value may be ignored when determining touch locations.
- Key Scenario 1: True Peak
- Referring to
FIG. 6 , identify the transition from a positive to a negative slope as a potential peak. This would be the point circled incolumn 7 of the example data values shown inFIG. 6 . - Key Scenario 2: Slope Ratio Beyond Threshold (“Fuzzy” Peak Detection)
- A key threshold of slope ratios may be used to flag additional peaks. The threshold value used may be, for example but is not limited to, 2:1; so instances where there is a change of slope greater than 2:1 may be identified as potential peaks. This applies to positive and negative slopes. This would be the point circled in
column 6 of the example data values shown inFIG. 6 . - Why not Just Look at the Slope Signs?
- Since the self scan is only one axis of a two-axis sensor array (e.g.,
conductive rows 105 andconductive columns 104 oftouch sensor 102,FIG. 1 ), it is possible for two touches that are off by a single “bar” (e.g., column) to only show a single peak. With the example data, there could be two touches, one at 6,6 and another at 7,7 (seeFIGS. 6 and 9 ). Without the additional peak detection, the touch at 6,3 may not be detected. - Once a potential touch location is identified, each adjacent touch location may be examined to determine if they have a greater value. If a greater value is present, eliminate the current potential touch location and identify the touch location of the greater value as a potential touch location. This process is repeated until a local peak has been identified.
- Referring to
FIG. 6 , depicted is a graph of single touch peak detection data, according to specific example embodiments of this disclosure. An example graph of data values for one column (e.g., column 7) of thetouch sensor 102 is shown wherein a maximum data value determined from the self and mutual capacitance measurements ofcolumn 7 occurs at thecapacitive touch sensor 104 area located arow 7,column 7. All data values that are below a threshold value may be ignored, e.g., below about 12 in the graphical representation shown inFIG. 6 . Therefore only data values taken at row 6 (data value=30) and at row 7 (data value=40) need be processed in determining the location of a touch to thetouch sensor 102. Slope may be determined by subtracting a sequence of adjacent row data values in a column to produce either a positive or negative slope value. When the slope value is positive the data values are increasing, and when the slope value is negative the data values are decreasing. A true peak may be identified as a transition from a positive to a negative slope as a potential peak. A transition from a positive slope to a negative slope is indicated atdata value 422 of the graph shown inFIG. 6 . - However another touch may have occurred at
column 6 and was not directly measured in thecolumn 7 scan, but shows up asdata value 420 during thecolumn 7 scan. Without another test besides the slope sign transition, the potential touch atcolumn 6 may be missed. Therefore a threshold of slope ratios may further be used to flag additional potential peaks. Slope is the difference between two data values of adjacentconductive columns 104. This threshold of slope ratios may be, for example but is not limited to, 2:1 so instances where there is a change of slope greater than 2:1 may be identified as another potential peak. This may apply to both positive and negative slopes. For example, thedata value 420, taken atrow 6, has a left slope of 23:1 (30−7) and a right slope of 10:1 (40−30). Thedata value 422, taken atrow 7, has a left slope of 10:1 (40−30) and right slope of −30:1 (10−40). The slope ratio forrow 6 of 23:10, exceeds the example 2:1 threshold and would be labeled for further processing. All other data values are below the data value threshold and may be ignored. - Referring to
FIG. 7 , depicted is a schematic plan diagram of potential touch and mutual touch locations of a touch sensor, according to specific example embodiments of this disclosure. Once a potential touch location is identified, each adjacent location thereto may be examined to determine whether any one of them may have a greater data value than the current potential touch location (labeled “C” inFIGS. 7( a) & 7(b)). If a greater data value is found, then the current potential touch location may be eliminated and the touch location having the greater value may be identified as a potential touch location. This is referred to herein as the process ofNudge 512 and may be repeated until a data peak has been identified. - During a data acquisition scan of a column of rows, only tier one nodes (labeled “1” in
FIGS. 7( a) and 7(b)—adjacent locations to the current potential touch location) are examined. If any of these tier one nodes has a larger data value than the data value of the current potential touch location, a new current touch location is shifted (“nudged”) to that node having the highest data value and the process ofNudge 512 is repeated. If a tier one node is already associated with a different potential peak, then no further searching is necessary and the current data peak may be ignored. Tier two nodes (labeled “2” inFIGS. 7( a) & 7(b)—adjacent locations to the tier one nodes) are examined when there is a potential of a large area activation of thetouch sensor 102. - After one
conductive column 104 has been scanned for mutual capacitance values, the process ofNudge 512 may be speeded up by storing the mutual capacitance data values of that one column in a cache memory, then doing theNudge 512 first on the tier one nodes, and then on the tier two nodes of that one column from the mutual capacitance data values stored in the cache memory. Then only after there are no further nudges to do in that one column will the process ofNudge 512 examine the tier one and tier two nodes from the mutual capacitance measurement scans of the two each adjacent columns on either side of the column having the process ofNudge 512 performed thereon. - Interpolation of the potential touch location may be performed by using the peak data value node (touch location) as well as each adjacent node thereto (e.g., tier one nodes from a prior Nudge 512) to create sub-steps between each node. For example, but not limited to, 128 steps may be created between each node. Referring to
FIG. 7( c), node A is the potential touch location and nodes B, C, D and E are tier one nodes adjacent thereto. The interpolated X, Y location may be found using the following equations: -
Location_x = (D_Value − B_Value) / A_Value * 64
Location_y = (E_Value − C_Value) / A_Value * 64
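- A minimal sketch of these interpolation equations, assuming A is the peak node value, B/D are its horizontal (left/right) neighbor values and C/E its vertical neighbor values, and using an integer multiply-before-divide to keep the 64 sub-step scaling, is:

    #include <stdint.h>

    typedef struct { int16_t x, y; } fine_offset_t;

    /* Sub-node interpolation per the equations above: roughly +/-64 sub-steps
     * (128 steps per node) of offset from the peak node A.                    */
    fine_offset_t interpolate(int16_t a, int16_t b, int16_t c, int16_t d, int16_t e)
    {
        fine_offset_t f;
        f.x = (int16_t)(((int32_t)(d - b) * 64) / a);   /* (D - B) / A * 64 */
        f.y = (int16_t)(((int32_t)(e - c) * 64) / a);   /* (E - C) / A * 64 */
        return f;
    }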
- Referring to
FIG. 8 , depicted is a schematic plan view diagram of a touch sensor showing a cache data window thereof, according to specific example embodiments of this disclosure. Theconductive columns 104 of thetouch sensor 102 may be scanned column by column for self capacitance values until allconductive columns 104 have been scanned. Eachconductive column 104 indicating a potential touch from the self capacitance data may be sequentially scanned for determining mutual capacitive values thereof (touch data) and when peaks are discovered they may be processed contemporaneously with the column scan. Furthermore, touch data may be stored in a cache memory for further processing. Since theNudge 512 looks at the first tier nodes then the second tier nodes, if necessary, not all of the touch data from all of theconductive columns 104 need be stored at one time. This allows a simple caching system using a minimum amount of random access memory (RAM). For example, storing five columns of touch data in a cache. The five columns are contiguous and a cache window may move across thecolumns 104 of thetouch sensor 102 onecolumn 104 at a time. It is contemplated and within the scope of this disclosure that more or fewer than five columns of touch data may be stored in a cache memory and processed therefrom, and/or self capacitance scanning by rows instead of columns may be used instead. All descriptions herein may be equally applicable to self capacitance scanning of rows then mutual capacitance scanning by columns of those row(s) selected from the self capacitance scan data. - Whenever a Mutual Scan of a first or second tier node (capacitive sensor 104) is requested, it may be called first from the cache memory. If the requested node touch data is present in the cache memory, the cache memory returns the requested touch data of that first or second tier node. However, if the requested touch data is not present in the cache memory then the following may occur: 1) If the column of the requested touch data is in the range of the cache window then perform the mutual scan of that column and add the touch data to the cache memory, or 2) If the column of the requested touch data is not in the range of the present cache window then shift the cache window range and perform the mutual scan of the new column and add the resulting touch data from the new cache window to the cache memory.
- Referring to
FIG. 9 , depicted are a graph of self scan values and a table of mutual scan values for two touch peak detection data, according to specific example embodiments of this disclosure. Since a self scan is performed in only one axis (e.g., one column), it is possible for two touches that are off by a single column to only show a single peak. For the example data values shown inFIG. 9 , two touches may have occurred, one at selfscan data value 422 and the other indicated at selfscan data value 420. Without being aware of change of slopes greater than 2:1, the potential touch represented by self scan data value 420 may have been missed. A first touch may causedata value 422 and a second touch may causedata value 420. The processes ofPeak Detection 510 and Nudge 512 (FIG. 5 ), as described hereinabove, may further define these multiple touches as described herein. Once each multiple touch has been defined a force thereof may be determined and associated its respective touch. - Referring to
FIG. 24 , depicted is a schematic plan view of a finger of a hand touching a surface of a touch sensor, according to a specific example embodiment of this disclosure. A hand of a user, generally represented by the numeral 2400, may hover over a face of a touch sensor 102, e.g., touch screen or panel, having a plurality of locations such that when at least one of the plurality of locations is touched by a finger 2402 of the hand 2400, the location on the face of the touch sensor 102 and the force thereto are detected and stored for further processing as disclosed herein. For example, a light touch of the finger 2402 on the face of the touch sensor 102 may select an object (not shown) displayed by a visual display integral therewith. Upon the finger 2402 pressing a little harder at the touch location, the selected object may be locked in place. Pressing even harder on the locked object and then gesturing to move the object may release the lock on the object. In another example, pressing on the object (not shown) selects the object, then pressing harder fixes the object's location. Releasing the pressure (force) on the object and then pressing hard on the object again would release the object to move again. - Referring to
FIG. 25 , depicted is a schematic plan view of two fingers of a hand touching a surface of a touch sensor, according to another specific example embodiment of this disclosure. A finger 2504 over a left portion of the touch sensor 102 and another finger 2506 over a right portion of the touch sensor 102 may be used to rotate an object (not shown) displayed by a visual display integral therewith. For example, when the left oriented finger 2504 presses harder than the right oriented finger 2506, the object may rotate counterclockwise about an axis parallel with the axis of the wrist/arm. When the right oriented finger 2506 presses harder than the left oriented finger 2504, the object may rotate clockwise about the axis parallel with the axis of the wrist/arm. When the wrist is rotated while the fingers 2504 and 2506 remain on the touch sensor 102, the object (not shown) may rotate substantially perpendicular to the axis of the wrist/arm (substantially parallel with the face of the touch sensor 102) and in the direction of the rotation of the fingers 2504 and 2506. - Referring to
FIG. 26 , depicted is a schematic plan view of a finger of a hand touching an object projected on a surface of a touch sensor, according to yet another specific example embodiment of this disclosure. Pressing on the face of the touch sensor 102 over an object 2608 with a finger 2402 may be used to scale the size of the object. For example, the greater the force of the press (touch) by the finger 2402, the larger the object may be displayed. The object may remain at the new larger size or may vary in size in proportion to the force applied to the face of the touch sensor, e.g., a harder press will result in a larger object and a softer press will result in a smaller object. The size of the object may follow the amount of force applied by the finger 2402 to the face of the touch sensor 102. - Referring to
FIG. 27 , depicted is a schematic plan view of a finger of a hand touching a document projected on a surface of a touch sensor, according to still another specific example embodiment of this disclosure. A document 2710 may be displayed on a face of the touch sensor 102. A touch of sufficient force by the finger 2402 to a portion of the document 2710 may be used to flip through pages thereof. A finger 2402 movement, for example but not limited to, to the right may remove the currently visible page(s) of the document 2710. Pressing on a removed page near another new document (not shown) may be used to flip through the new document (not shown) and/or may allow insertion of the removed page into the new document. For example, pressing on a document 2710 flips through a stack of document pages. If the finger 2402 then moves off the document, the selected page may be removed. Pressing on a single page next to a document may flip through the document and then may insert the page when it is dragged over the document. - Referring to
FIG. 28 , depicted is a schematic plan view of a finger of a hand touching one digit of a number projected on a surface of a touch sensor, according to another specific example embodiment of this disclosure. At least one number or letter, e.g., an alpha-numeric character 2814, may be displayed on the face of the touch sensor. A finger 2402 may press on a portion of the character 2814, wherein the amount of force applied by the finger 2402 may cause the character 2814 to increase or decrease alpha-numerically in value, accordingly. When the character 2814 is at a desired value, the finger 2402 may slide off, e.g., up, down or sideways, to leave editing of the character 2814. An increase in the alpha-numeric value may be controlled by pressing the finger 2402 on an upper portion of the character 2814, and a decrease in the alpha-numeric value may be controlled by pressing the finger 2402 on a lower portion of the character 2814. The speed of increase or decrease of the alpha-numeric value may be proportional to the amount of force applied by the finger 2402 to the surface of the touch sensor 102. More than one finger may be used to contemporaneously increase and/or decrease more than one alpha-numeric character. For example, a finger 2402 may be pressed on a single digit 2814 of a number (124779 shown), whereby the single digit 2814 sequentially flips through numerical values, e.g., 0-9. When a desired numerical value is displayed, the finger 2402 may be dragged off the digit to leave the selected numerical value. - Referring to
FIGS. 13 to 23 , depicted are schematic process flow diagrams for touch decoding and force determination of the decoded touch(es), according to specific example embodiments of this disclosure.FIG. 13 shows a general overview of possible processes for multi-touch decoding and force determination for atouch sensor 102 enabled device. It is contemplated and within the scope of this disclosure that more, fewer and/or some different processes may be utilized with atouch sensor 102 enabled device and still be within the scope, intent and spirit of this disclosure. In step 1050 a device is started, actuated, etc., when instep 1052 power is applied to the device. Instep 1054 the device may be initialized, and thereafter instep 1056 the process ofTouch Identification 504 may begin. Once the process ofTouch Identification 504 instep 1056 has determined the touch locations,step 1057 determines the force applied at each of those touch locations. Instep 1058 touch and force tracking may be performed on those touches identified instep 1056. Instep 1060 the touch and force data may be further processed if necessary, otherwise it may be transmitted to the processing and control logic of the device for display and/or control of the device's intended purpose(s) instep 1062. - In the descriptions of the following process steps references to “top” or “north” channel or node will mean the channel or node above another channel or node, “bottom” or “south” channel or node will mean the channel or node below another channel or node, “left” or “west” channel or node will mean the channel or node to the left of another channel or node, and “right” or “east” channel or node will mean the channel or node to the right of another channel or node.
- Referring to
FIG. 14 , a flow diagram of a process ofTouch Identification 504 is shown and described hereinafter. Instep 1102 the process of Touch Identification 504 (FIG. 5 ) begins. In step 1104 a self scan of all channels on one axis may be performed, e.g., either all columns or all rows. Instep 1106 the first self scan value may be examined. Instep 1108 the (first or subsequent) self scan value may be compared to a self touch threshold value. - A self
peak detection process 1100 may comprisesteps 1110 to 1118, and is part of the overall process of Peak Detection 510 (FIG. 5 ). If the self scan value is less than the self touch threshold value as determined instep 1108, then step 1238 (FIG. 15 ) may determine whether there are any additional self scan values to be examined. However, if the self scan value is equal to or greater than the self touch threshold value as determined instep 1108, then step 1110 may calculate a left slope between the self scan value and a self scan value of the channel to the left of the present channel. Then step 1112 may calculate a right slope between the self scan value and a self scan value of the channel to the right of the present channel. -
Step 1114 determines whether the left slope may be greater than zero (positive slope) and the right slope may be less than zero (negative slope), identifying a peak. If a yes result instep 1114, then step 1120 may perform mutual scan measurements on each node of the channel selected from the self scan data. If a no result instep 1114, then step 1116 determines whether the left slope may be greater than zero (positive slope) and greater than the right slope may be, for example but is not limited to, two times (twice) greater than the right slope. If a yes result instep 1116, then instep 1120 mutual scan measurements may be performed on each node of the selected self scan channel. If a no result instep 1116, then step 1118 determines whether the left slope may be, for example but is not limited to, less than zero (negative slope) and greater than a percentage of the right slope, e.g., fifty (50) percent. If a yes result instep 1116, then step 1120 may perform mutual scan measurements on each node of the channel selected from the self scan data. If a no result instep 1116, then step 1238 (FIG. 15 ) may determine whether there are any additional columns to be examined based upon the self scan values thereof.Step 1122 may examine a first mutual scan value. - Referring to
FIG. 15 , a mutualpeak detection process 1244 may comprisesteps 1226 to 1234, and is part of the overall Peak Detection process 510 (FIG. 5 ).Step 1224 may compare the (first or subsequent) mutual scan value to a mutual touch threshold value, wherein if the mutual scan value is less than the mutual touch threshold value then step 1236 may determine whether there are any additional mutual scan values to be examined. However, if the mutual scan value is equal to or greater than the mutual touch threshold value then step 1226 may calculate a slope to the next mutual scan value node, then step 1228 may calculate a slope to the previous mutual scan value node. -
Step 1230 determines whether the next slope may be less than zero (negative slope) and the previous slope may be greater than zero (positive slope). If a yes result instep 1230, then step 1350 (FIG. 16 ) may start the process ofNudge 512 and/or the process of Interpolation 514 (FIG. 5 ). If a no result instep 1230, then step 1232 determines whether the next slope may be, for example but is not limited to, greater than zero (positive slope) and less than a percentage of the previous slope. If a yes result instep 1232, then step 1350 (FIG. 16 ) may start the process ofNudge 512 and/or the process of Interpolation 514 (FIG. 5 ). If a no result instep 1232, then step 1234 determines whether the next slope may be, for example but is not limited to, less than zero (negative slope) and greater than the previous slope. If a yes result instep 1234, then step 1350 (FIG. 13 ) may start the process ofNudge 512 and/or the process of Interpolation 514 (FIG. 5 ). If a no result instep 1234, then step 1236 determines whether there may be any additional mutual values to be examined. If a yes result instep 1236, then step 1242 may examine a next mutual value. If a no result instep 1236, then step 1238 determines whether there may be any additional self scan values to be examined. If a yes result instep 1238, then step 1240 examines a next self scan value that may be returned to step 1108 (FIG. 14 ) for further processing thereof. If a no result instep 1238, then in step 1244 a touch detection frame may be complete. - Referring to
FIGS. 16-18 , flow diagrams of processes forNudge 512 and Interpolation 514 (FIG. 5 ) are shown and described hereinafter.Step 1350 may start the process ofNudge 512 and/or the process ofInterpolation 514 by using a peak location from the process of Touch Identification 504 (FIG. 5 ) and may comprise the following process steps:Step 1352 determines whether there may be a valid node to the north. If a no result instep 1352, then continue to step 1360. If a yes result instep 1352, then step 1354 may make a mutual scan measurement of the node to the north.Step 1356 determines whether the mutual scan data of the north node may be greater than the current node. If a no result instep 1356, then continue to step 1360. If a yes result instep 1356, then instep 1358 the north node may become the current node, and then continue to step 1486 (FIG. 17 ). -
Step 1360 determines whether there may be a valid node to the south. If a no result instep 1360, then continue to step 1470 (FIG. 17 ). If a yes result instep 1360, then step 1362 may make a mutual scan measurement of the node to the south.Step 1364 determines whether the mutual scan data of the south node may be greater than the current node. If a no result instep 1364, then continue to step 1470 (FIG. 17 ). If a yes result instep 1364, then instep 1366 the south node may become the current node, and then continue to step 1486 (FIG. 17 ). - Referring to
FIG. 17 ,step 1470 determines whether there may be a valid node to the east. If a no result instep 1470, then continue to step 1478. If a yes result instep 1470, then step 1472 may make a mutual scan measurement of the node to the east.Step 1474 determines whether the mutual scan data of the east node may be greater than the current node. If a no result instep 1474, then continue to step 1478. If a yes result instep 1474, then instep 1476 the east node may become the current node, and then continue to step 1486. -
Step 1478 determines whether there may be a valid node to the west. If a no result instep 1478, then continue to step 1502 (FIG. 18 ). If a yes result instep 1478, then step 1480 may make a mutual measurement of the node to the west.Step 1482 determines whether the mutual scan data of the west node may be greater than the current node. If a no result instep 1482, then continue to step 1502 (FIG. 18 ). If a yes result instep 1482, then instep 1484 the west node may become the current node.Step 1486 determines whether a touch point may already exist at the selected node. If a no result instep 1486, then continue to step 1352 (FIG. 16 ). If a yes result instep 1486, then step 1488 may eliminate the current peak, and then continue to step 1236 (FIG. 15 ). - Referring to
FIG. 18 , a flow diagram of a process ofInterpolation 514 may comprisesteps 1502 to 1518.Step 1502 determines whether there may be a valid node to the left. If a no result instep 1502, then continue to step 1510 wherein the left node value may be defined as a center value minus a right value then continue to step 1506. If a yes result instep 1502, then step 1504 may perform a mutual scan measurement on the node to the left. Then step 1506 determines whether there may be a valid node to the right. If a no result instep 1506, then continue to step 1512 wherein the right node value may be defined as a center value minus a left value then continue to step 1516. If a yes result instep 1506, then step 1508 may perform a mutual scan measurement on the node to the right.Step 1516 may determine a fine position by subtracting the left value from the right value, dividing the difference thereof by the center value, and then multiplying the result by, for example but is not limited to, the number 64. It is contemplated and within the scope and spirit of this disclosure that many ways of determining valid peaks and nodes may be used as one having ordinary skill in the art of touch detection and tracking could readily implement by having knowledge based upon the teachings of this disclosure - After
step 1516 has completed the aforementioned calculations,step 1514 determines whether anInterpolation 514 may have been performed for each axis. If a no result instep 1514, then step 1518 may interpolate another axis, thereafter steps 1502 to 1516 may be repeated, with “above” replacing “left” and “below” replacing “right” in each step. If a yes result instep 1514, then step 1520 may add this touch point to a list of all detected touch points. Then step 1522 may return to step 1236 (FIG. 15 ) for any additional mutual scan values to be examined. - Referring to
FIG. 19 , a flow diagram of a process ofForce Identification 505 is shown and described hereinafter. After a new touch point is added in step 1520 (FIG. 18 ),step 1550 starts the process of determining the force applied to thetouch sensor 102 at that touch point. Untouched mutual capacitances of each point on thetouch sensor 102 may be stored in a memory of thedigital processor 106 after a “no touch” calibration scan of all points of thetouch sensor 102 is performed. When a force is applied to a touch location, the value of the mutual capacitance of that touch location will increase. Instep 1552 that mutual capacitance change may be determined, and instep 1554 the mutual capacitance change may be converted into a force value. Once this force value is determined, instep 1556 the force value may then be associated with the new touch point and stored in the list of all detected touches. - Referring to
FIGS. 20 , 21 and 22, flow diagrams of a process of Touch andForce Tracking 506 are shown and described hereinafter. Instep 1602 the process of Touch andForce Tracking 506 may start by using the previously found and current touch locations.Step 1604 determines whether there may be any current touch locations. If a yes result instep 1604, then step 1606 may select the first of the current touch locations, and thereafter may continue to step 1722 (FIG. 21 ). If a no result instep 1604, then step 1610 determines whether there may be any previous touch location(s). If a yes result instep 1610, then step 1612 may select the first previous touch location. If a no result instep 1610, then atstep 1611 tracking is complete. -
Step 1614 determines whether the previous touch location may be associated with a current touch location. If a no result instep 1614, then step 1608 may assert an output of “touch no longer present at previous touch location, stop tracking,” and then return tostep 1616. If a yes result instep 1614, then step 1616 determines whether there may be any more previous touch locations. If a no result instep 1616, then atstep 1620 tracking touch locations is complete and the touch location data may be transmitted as Data Output 508 (FIG. 5 ) for further processing by the microcontroller 112 (FIG. 1 ). If a yes result instep 1616, then step 1618 may select the next previous touch location, and thereafter return tostep 1614. - Referring to
FIG. 21 ,step 1722 determines whether there may be any previous touch locations. If a no result instep 1722, then continue to step 1868 (FIG. 22 ) wherein a “New Touch to track is identified” at current location, and thereafter continue to step 1856 (FIG. 22 ). If a yes result instep 1722, then step 1724 may set a temporary weight value to a maximum weight value.Step 1726 may select the first of the previous touch locations. Then step 1728 may measure a distance between the selected current touch location and the selected previous touch location to determine a current distance (weight value) therebetween.Step 1730 determines whether the current weight value may be less than the temporary weight value. If a yes result instep 1730, then step 1732 may set the temporary weight value to the current weight value and thereafter may record the selected previous touch location as a temporary location and continue to step 1734. If a no result instep 1730, then step 1734 determines whether there may be more previous touch locations. If a yes result instep 1734, then step 1736 may select the next previous touch location, and thereafter return tostep 1728. If a no result instep 1734, then step 1738 determines whether the temporary location may have already been assigned to a different current location. If a yes result instep 1738, then step 1740 may calculate a next worst weight value for the current location and for an assigned current location, and thereafter continue to step 1860 (FIG. 22 ). If a no result instep 1738, then continue to step 1850 (FIG. 22 ). - Referring to
FIG. 22 ,step 1850 determines whether the weight value may be below a maximum association threshold. If a no result instep 1850, then step 1854 may identify a new touch location for tracking. If a yes result instep 1850, then step 1852 may assign a new temporary location to the current location and then continue to step 1856.Step 1860 determines whether the next worst weight value for the current location may be less than the next worst weight value for the assigned location. If a yes result instep 1860, then step 1862 may set the temporary location to the next worst location and thereafter continue to step 1856. If a no result instep 1860, then step 1864 may set the assigned location to the next worst weight value.Step 1866 may select a moved assignment location and thereafter return to step 1722 (FIG. 21 ).Step 1856 determines whether there may be more current touch locations. If a yes result instep 1856, then step 1858 may select the next current touch location and thereafter return to step 1722 (FIG. 21 ). - Referring to
Referring to FIG. 23, depicted is a process flow diagram for a column cache, according to specific example embodiments of this disclosure. Step 1902 may receive a mutual scan location request. Step 1904 determines whether the mutual scan area location requested may be stored in the cache memory. If a yes result in step 1904, then step 1920 determines whether the mutual scan data stored in the cache memory may be valid. If a yes result in step 1920, then step 1922 may return mutual scan data from the cache memory. If a no result in step 1920, then step 1918 may perform a mutual scan at the requested location, wherein step 1916 may write the mutual scan data to a location in the cache memory and then return back to step 1922. -

If a no result in step 1904, then step 1906 determines whether the requested touch location may be beyond the right edge of the cache. If a yes result in step 1906, then step 1908 may de-allocate the left-most column of mutual scan data from the cache memory. In step 1910 the de-allocated mutual scan data may be allocated to the right edge of the cache memory so as to move the edge values thereof, and thereafter return to step 1904. If a no result in step 1906, then step 1914 may de-allocate the right-most column of data from the cache memory. In step 1912 the de-allocated mutual scan data may be allocated to the left edge of the cache memory so as to move the edge values thereof, and thereafter return to step 1904. -
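The column cache of FIG. 23 thus behaves like a sliding window of mutual scan columns: a request outside the window evicts the column at the far edge, re-allocates that slot at the near edge, and re-scans only columns that are missing or marked invalid. Below is a minimal C sketch under those assumptions; the names (cache_request, mutual_scan_column, CACHE_COLUMNS, ROWS) and the dummy scan data are illustrative only and not taken from this disclosure.

```c
/*
 * Minimal sketch of the FIG. 23 column cache as a sliding window of column
 * buffers (hypothetical names and sizes).
 */
#include <stdbool.h>
#include <stdio.h>

#define CACHE_COLUMNS  4     /* hypothetical number of cached columns */
#define ROWS           8     /* hypothetical sensor rows per column   */

typedef struct {
    int  column;             /* sensor column held in this slot       */
    bool valid;              /* step 1920: is the cached data valid?  */
    int  data[ROWS];         /* mutual-capacitance scan values        */
} cache_slot_t;

static cache_slot_t cache[CACHE_COLUMNS];
static int left_col;         /* sensor column held in cache slot 0    */

/* Placeholder for the hardware mutual scan of one column (step 1918). */
static void mutual_scan_column(int column, int *out)
{
    for (int r = 0; r < ROWS; r++)
        out[r] = column * 100 + r;   /* dummy data for the sketch */
}

/* Steps 1902-1922: return cached data, sliding the window on a miss. */
static const int *cache_request(int column)
{
    while (column >= left_col + CACHE_COLUMNS) {       /* beyond right edge (1906) */
        for (int i = 0; i < CACHE_COLUMNS - 1; i++)     /* drop left-most (1908)    */
            cache[i] = cache[i + 1];
        left_col++;
        cache[CACHE_COLUMNS - 1].column = left_col + CACHE_COLUMNS - 1;
        cache[CACHE_COLUMNS - 1].valid  = false;        /* re-allocated at right (1910) */
    }
    while (column < left_col) {                         /* not beyond right edge    */
        for (int i = CACHE_COLUMNS - 1; i > 0; i--)     /* drop right-most (1914)   */
            cache[i] = cache[i - 1];
        left_col--;
        cache[0].column = left_col;
        cache[0].valid  = false;                        /* re-allocated at left (1912) */
    }

    cache_slot_t *slot = &cache[column - left_col];
    if (!slot->valid) {                                 /* steps 1918, 1916 */
        mutual_scan_column(column, slot->data);
        slot->valid = true;
    }
    return slot->data;                                  /* step 1922 */
}

int main(void)
{
    for (int i = 0; i < CACHE_COLUMNS; i++) {
        cache[i].column = i;
        cache[i].valid  = false;
    }
    left_col = 0;

    const int *col = cache_request(6);   /* miss: window slides right, column 6 is scanned */
    printf("column 6, row 0 value: %d\n", col[0]);
    return 0;
}
```

A design point worth noting is that sliding one column at a time preserves most of the previously scanned data when the touch moves gradually, so only the newly exposed edge column needs a fresh mutual scan. -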
While embodiments of this disclosure have been depicted, described, and are defined by reference to example embodiments of the disclosure, such references do not imply a limitation on the disclosure, and no such limitation is to be inferred. The subject matter disclosed is capable of considerable modification, alteration, and equivalents in form and function, as will occur to those ordinarily skilled in the pertinent art and having the benefit of this disclosure. The depicted and described embodiments of this disclosure are examples only, and are not exhaustive of the scope of the disclosure.
Claims (37)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/254,098 US9430107B2 (en) | 2012-03-30 | 2014-04-16 | Determining touch locations and forces thereto on a touch and force sensing surface |
CN201580009205.9A CN106030463A (en) | 2014-04-16 | 2015-04-15 | Determining touch locations and forces thereto on touch and force sensing surface |
JP2016550560A JP2017511520A (en) | 2014-04-16 | 2015-04-15 | Determination of the touch location on the touch and force sensing surface and the force there |
KR1020167024023A KR20160144967A (en) | 2014-04-16 | 2015-04-15 | Determining touch locations and forces thereto on a touch and force sensing surface |
EP15719103.2A EP3132330B1 (en) | 2014-04-16 | 2015-04-15 | Determining touch locations and forces thereto on a touch and force sensing surface |
PCT/US2015/025968 WO2015160948A1 (en) | 2014-04-16 | 2015-04-15 | Determining touch locations and forces thereto on a touch and force sensing surface |
TW104112257A TWI669650B (en) | 2014-04-16 | 2015-04-16 | Determining touch locations and forces thereto on a touch and force sensing surface |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261617831P | 2012-03-30 | 2012-03-30 | |
US201361777910P | 2013-03-12 | 2013-03-12 | |
US13/830,891 US9207820B2 (en) | 2012-03-30 | 2013-03-14 | Method and system for multi-touch decoding |
US14/097,370 US20140267152A1 (en) | 2013-03-12 | 2013-12-05 | Force Sensing X-Y Touch Sensor |
US14/254,098 US9430107B2 (en) | 2012-03-30 | 2014-04-16 | Determining touch locations and forces thereto on a touch and force sensing surface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/830,891 Continuation-In-Part US9207820B2 (en) | 2012-03-30 | 2013-03-14 | Method and system for multi-touch decoding |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140210791A1 true US20140210791A1 (en) | 2014-07-31 |
US9430107B2 US9430107B2 (en) | 2016-08-30 |
Family
ID=51222402
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/254,098 Expired - Fee Related US9430107B2 (en) | 2012-03-30 | 2014-04-16 | Determining touch locations and forces thereto on a touch and force sensing surface |
Country Status (1)
Country | Link |
---|---|
US (1) | US9430107B2 (en) |
Cited By (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140267061A1 (en) * | 2013-03-12 | 2014-09-18 | Synaptics Incorporated | System and method for pre-touch gestures in sensor devices |
US8908894B2 (en) | 2011-12-01 | 2014-12-09 | At&T Intellectual Property I, L.P. | Devices and methods for transferring data through a human body |
US20150193047A1 (en) * | 2013-09-10 | 2015-07-09 | Cypress Semiconductor Corporation | Interleaving sense elements of a capacitive-sense array |
CN105117055A (en) * | 2015-08-14 | 2015-12-02 | 宸鸿科技(厦门)有限公司 | Touch pressed type three-dimensional signal input device, application method and multi-functional touch control panel |
US20150355788A1 (en) * | 2013-03-01 | 2015-12-10 | Lenovo (Beijing) Co., Ltd. | Method and electronic device for information processing |
US20160085324A1 (en) * | 2014-09-24 | 2016-03-24 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
WO2016058342A1 (en) * | 2014-10-15 | 2016-04-21 | 京东方科技集团股份有限公司 | Display device and drive method therefor |
US9349280B2 (en) | 2013-11-18 | 2016-05-24 | At&T Intellectual Property I, L.P. | Disrupting bone conduction signals |
US9405892B2 (en) | 2013-11-26 | 2016-08-02 | At&T Intellectual Property I, L.P. | Preventing spoofing attacks for bone conduction applications |
WO2016130951A1 (en) * | 2015-02-12 | 2016-08-18 | Qualcomm Technologies, Inc. | Integrated touch and force detection |
US9430043B1 (en) | 2000-07-06 | 2016-08-30 | At&T Intellectual Property Ii, L.P. | Bioacoustic control system, method and apparatus |
WO2016138535A1 (en) * | 2015-02-27 | 2016-09-01 | Tactual Labs Co. | Multitouch frame matching with distance fields |
US9454253B2 (en) * | 2014-08-01 | 2016-09-27 | Hideep Inc. | Smartphone |
US9486027B2 (en) | 2014-10-17 | 2016-11-08 | Guardhat, Inc. | Connection assembly for adjoining a peripheral with a host wearable device |
US9501195B1 (en) | 2015-07-27 | 2016-11-22 | Hideep Inc. | Smartphone |
US9535529B2 (en) | 2014-09-19 | 2017-01-03 | Hideep Inc. | Smartphone |
US9578148B2 (en) | 2014-09-19 | 2017-02-21 | Hideep Inc. | Smartphone capable of detecting touch position and pressure |
US9582071B2 (en) | 2014-09-10 | 2017-02-28 | At&T Intellectual Property I, L.P. | Device hold determination using bone conduction |
EP3136210A1 (en) * | 2015-08-31 | 2017-03-01 | HiDeep Inc. | Pressure detector capable of adjusting pressure sensitivity and touch input device including the same |
US9589482B2 (en) | 2014-09-10 | 2017-03-07 | At&T Intellectual Property I, L.P. | Bone conduction tags |
US9594433B2 (en) | 2013-11-05 | 2017-03-14 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
US9600079B2 (en) | 2014-10-15 | 2017-03-21 | At&T Intellectual Property I, L.P. | Surface determination via bone conduction |
US9612265B1 (en) | 2011-09-23 | 2017-04-04 | Cypress Semiconductor Corporation | Methods and apparatus to detect a conductive object |
US9715774B2 (en) | 2013-11-19 | 2017-07-25 | At&T Intellectual Property I, L.P. | Authenticating a user on behalf of another user based upon a unique body signature determined through bone conduction signals |
CN106980423A (en) * | 2016-01-19 | 2017-07-25 | 瑞鼎科技股份有限公司 | Self-capacitance touch and pressure sensing device and self-capacitance touch and pressure sensing method |
WO2017201338A1 (en) * | 2016-05-18 | 2017-11-23 | Sensel Inc. | Method for detecting and confirming a touch input |
KR20180009369A (en) * | 2018-01-10 | 2018-01-26 | 주식회사 하이딥 | Pressure detector capable of pressure sensitivity adjustment and touch input depvice including the same |
US9882992B2 (en) | 2014-09-10 | 2018-01-30 | At&T Intellectual Property I, L.P. | Data session handoff using bone conduction |
US10007380B2 (en) | 2013-07-29 | 2018-06-26 | Hideep Inc. | Touch input device with edge support member |
US10045732B2 (en) | 2014-09-10 | 2018-08-14 | At&T Intellectual Property I, L.P. | Measuring muscle exertion using bone conduction |
US10108984B2 (en) | 2013-10-29 | 2018-10-23 | At&T Intellectual Property I, L.P. | Detecting body language via bone conduction |
US20190087035A1 (en) * | 2017-09-15 | 2019-03-21 | Stmicroelectronics Asia Pacific Pte Ltd | Partial mutual capacitive touch sensing in a touch sensitive device |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
JP2019515372A (en) * | 2016-03-31 | 2019-06-06 | シナプティクス インコーポレイテッド | Combination of transformer capacity data and absolute capacity data for touch force estimation |
US10444921B2 (en) * | 2016-10-21 | 2019-10-15 | Salt International Corp. | Capacitive sensing device and detection method for an irregular conductive matter in a touch event |
US10474271B2 (en) | 2014-08-01 | 2019-11-12 | Hideep Inc. | Touch input device |
US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US10678322B2 (en) | 2013-11-18 | 2020-06-09 | At&T Intellectual Property I, L.P. | Pressure sensing via bone conduction |
CN111366273A (en) * | 2020-03-04 | 2020-07-03 | 中国科学院苏州纳米技术与纳米仿生研究所 | Attachable vertical microcapacitive flexible mechanical sensor and its manufacturing method and application |
US10831316B2 (en) | 2018-07-26 | 2020-11-10 | At&T Intellectual Property I, L.P. | Surface interface |
WO2021005327A1 (en) * | 2019-07-09 | 2021-01-14 | Cambridge Touch Technologies Ltd | Force signal processing |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
WO2021076123A1 (en) * | 2019-10-16 | 2021-04-22 | Google Llc | Capacitive sensor latency compensation |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
US11023065B2 (en) | 2013-07-29 | 2021-06-01 | Hideep Inc. | Touch sensor |
TWI735117B (en) * | 2018-12-27 | 2021-08-01 | 聯詠科技股份有限公司 | Electronic device and fingerprint sensing control method thereof |
US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
US11321958B2 (en) | 2018-12-27 | 2022-05-03 | Novatek Microelectronics Corp. | Electronic device and fingerprint sensing control method thereof |
US11561636B2 (en) * | 2016-11-24 | 2023-01-24 | Hideep Inc. | Touch input device for detecting pressure with display noise compensation |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
DE102016125229B4 (en) | 2016-03-02 | 2023-03-23 | Google LLC (n.d.Ges.d. Staates Delaware) | Force measurement with capacitive touch surfaces |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US11635839B2 (en) | 2016-03-25 | 2023-04-25 | Sensel Inc. | System and method for detecting and characterizing force inputs on a surface |
WO2023219259A1 (en) * | 2022-05-12 | 2023-11-16 | 엘지전자 주식회사 | Image display device and operating method therefor |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016073429A1 (en) * | 2014-11-04 | 2016-05-12 | Dow Agrosciences Llc | Pest control system and method of operating same |
KR102520639B1 (en) | 2018-05-02 | 2023-04-11 | 삼성디스플레이 주식회사 | Touch sensing device and display device including the same |
KR102700035B1 (en) | 2019-02-19 | 2024-08-29 | 삼성전자주식회사 | Electronic device for identifying coordinate of external object touching touch sensor |
CN111694440B (en) | 2019-03-13 | 2025-05-16 | 密克罗奇普技术公司 | Keyboard for secure data entry |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100253651A1 (en) * | 2009-04-06 | 2010-10-07 | Synaptics Incorporated | Input device with deflectable electrode |
US20110025629A1 (en) * | 2009-07-28 | 2011-02-03 | Cypress Semiconductor Corporation | Dynamic Mode Switching for Fast Touch Response |
US20120050211A1 (en) * | 2010-08-27 | 2012-03-01 | Brian Michael King | Concurrent signal detection for touch and hover sensing |
US20120075243A1 (en) * | 2010-09-24 | 2012-03-29 | Koji Doi | Display Device |
US20120079434A1 (en) * | 2009-05-04 | 2012-03-29 | Jin-He Jung | Device and method for producing three-dimensional content for portable devices |
US20120105358A1 (en) * | 2010-11-03 | 2012-05-03 | Qualcomm Incorporated | Force sensing touch screen |
US20120147052A1 (en) * | 2009-09-02 | 2012-06-14 | Fuminori Homma | Operation control device, operation control method and computer program |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997018528A1 (en) | 1995-11-13 | 1997-05-22 | Synaptics, Inc. | Stylus input capacitive touchpad sensor |
US7746325B2 (en) | 2002-05-06 | 2010-06-29 | 3M Innovative Properties Company | Method for improving positioned accuracy for a determined touch input |
US7538760B2 (en) | 2006-03-30 | 2009-05-26 | Apple Inc. | Force imaging input device and system |
US8063886B2 (en) | 2006-07-18 | 2011-11-22 | Iee International Electronics & Engineering S.A. | Data input device |
US7460441B2 (en) | 2007-01-12 | 2008-12-02 | Microchip Technology Incorporated | Measuring a long time period |
US7916126B2 (en) | 2007-06-13 | 2011-03-29 | Apple Inc. | Bottom-up watershed dataflow method and region-specific segmentation based on historic data to identify patches on a touch sensor panel |
US20090174675A1 (en) | 2008-01-09 | 2009-07-09 | Dave Gillespie | Locating multiple objects on a capacitive touch pad |
US8816986B1 (en) | 2008-06-01 | 2014-08-26 | Cypress Semiconductor Corporation | Multiple touch detection |
US7764213B2 (en) | 2008-07-01 | 2010-07-27 | Microchip Technology Incorporated | Current-time digital-to-analog converter |
EP2327007A4 (en) | 2008-08-01 | 2012-12-26 | 3M Innovative Properties Co | Touch sensitive devices with composite electrodes |
US8836350B2 (en) | 2009-01-16 | 2014-09-16 | Microchip Technology Incorporated | Capacitive touch sensing using an internal capacitor of an analog-to-digital converter (ADC) and a voltage reference |
JP5193942B2 (en) | 2009-05-14 | 2013-05-08 | 京セラディスプレイ株式会社 | Capacitive touch panel device |
US8614681B2 (en) | 2009-06-12 | 2013-12-24 | Cirque Corporation | Multitouch input to touchpad derived from positive slope detection data |
TW201122974A (en) | 2009-12-24 | 2011-07-01 | Ili Technology Corp | Touch detection device and method thereof. |
CN101840293B (en) | 2010-01-21 | 2012-03-21 | 宸鸿科技(厦门)有限公司 | Scanning method for projected capacitive touch panels |
US9948297B2 (en) | 2010-04-14 | 2018-04-17 | Frederick Johannes Bruwer | Pressure dependent capacitive sensing circuit switch construction |
US8933907B2 (en) | 2010-04-30 | 2015-01-13 | Microchip Technology Incorporated | Capacitive touch system using both self and mutual capacitance |
US8692795B1 (en) | 2010-08-24 | 2014-04-08 | Cypress Semiconductor Corporation | Contact identification and tracking on a capacitance sensing array |
JP5496851B2 (en) | 2010-10-22 | 2014-05-21 | 株式会社ジャパンディスプレイ | Touch panel |
TWI516994B (en) | 2011-03-15 | 2016-01-11 | 晨星半導體股份有限公司 | Method and associated apparatus for multi-touch control |
US8872804B2 (en) | 2011-07-21 | 2014-10-28 | Qualcomm Mems Technologies, Inc. | Touch sensing display devices and related methods |
US10222912B2 (en) | 2011-09-06 | 2019-03-05 | Atmel Corporation | Touch sensor with touch object discrimination |
US9748952B2 (en) | 2011-09-21 | 2017-08-29 | Synaptics Incorporated | Input device with integrated deformable electrode structure for force sensing |
US9317161B2 (en) | 2011-11-22 | 2016-04-19 | Atmel Corporation | Touch sensor with spacers supporting a cover panel |
TWI447632B (en) | 2012-03-09 | 2014-08-01 | Orise Technology Co Ltd | Driving frequency selection method for capacitive multi-touch system |
US9207820B2 (en) | 2012-03-30 | 2015-12-08 | Microchip Technology Incorporated | Method and system for multi-touch decoding |
TWI490760B (en) | 2012-04-03 | 2015-07-01 | Elan Microelectronics Corp | A method and an apparatus for improving noise interference of a capacitive touch device |
US8976151B2 (en) | 2012-09-14 | 2015-03-10 | Stmicroelectronics Asia Pacific Pte Ltd | Configurable analog front-end for mutual capacitance sensing and self capacitance sensing |
US8982097B1 (en) | 2013-12-02 | 2015-03-17 | Cypress Semiconductor Corporation | Water rejection and wet finger tracking algorithms for truetouch panels and self capacitance touch sensors |
- 2014-04-16 US US14/254,098 patent/US9430107B2/en not_active Expired - Fee Related
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100253651A1 (en) * | 2009-04-06 | 2010-10-07 | Synaptics Incorporated | Input device with deflectable electrode |
US20120079434A1 (en) * | 2009-05-04 | 2012-03-29 | Jin-He Jung | Device and method for producing three-dimensional content for portable devices |
US20110025629A1 (en) * | 2009-07-28 | 2011-02-03 | Cypress Semiconductor Corporation | Dynamic Mode Switching for Fast Touch Response |
US20120147052A1 (en) * | 2009-09-02 | 2012-06-14 | Fuminori Homma | Operation control device, operation control method and computer program |
US20120050211A1 (en) * | 2010-08-27 | 2012-03-01 | Brian Michael King | Concurrent signal detection for touch and hover sensing |
US20120075243A1 (en) * | 2010-09-24 | 2012-03-29 | Koji Doi | Display Device |
US20120105358A1 (en) * | 2010-11-03 | 2012-05-03 | Qualcomm Incorporated | Force sensing touch screen |
Cited By (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9430043B1 (en) | 2000-07-06 | 2016-08-30 | At&T Intellectual Property Ii, L.P. | Bioacoustic control system, method and apparatus |
US10126828B2 (en) | 2000-07-06 | 2018-11-13 | At&T Intellectual Property Ii, L.P. | Bioacoustic control system, method and apparatus |
US9612265B1 (en) | 2011-09-23 | 2017-04-04 | Cypress Semiconductor Corporation | Methods and apparatus to detect a conductive object |
US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US8908894B2 (en) | 2011-12-01 | 2014-12-09 | At&T Intellectual Property I, L.P. | Devices and methods for transferring data through a human body |
US9712929B2 (en) | 2011-12-01 | 2017-07-18 | At&T Intellectual Property I, L.P. | Devices and methods for transferring data through a human body |
US20150355788A1 (en) * | 2013-03-01 | 2015-12-10 | Lenovo (Beijing) Co., Ltd. | Method and electronic device for information processing |
US20140267061A1 (en) * | 2013-03-12 | 2014-09-18 | Synaptics Incorporated | System and method for pre-touch gestures in sensor devices |
US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
US10007380B2 (en) | 2013-07-29 | 2018-06-26 | Hideep Inc. | Touch input device with edge support member |
US11023065B2 (en) | 2013-07-29 | 2021-06-01 | Hideep Inc. | Touch sensor |
US20150193047A1 (en) * | 2013-09-10 | 2015-07-09 | Cypress Semiconductor Corporation | Interleaving sense elements of a capacitive-sense array |
US9563318B2 (en) * | 2013-09-10 | 2017-02-07 | Monterey Research, Llc | Interleaving conductive elements of a capacitive-sense array |
US10108984B2 (en) | 2013-10-29 | 2018-10-23 | At&T Intellectual Property I, L.P. | Detecting body language via bone conduction |
US10831282B2 (en) | 2013-11-05 | 2020-11-10 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
US10281991B2 (en) | 2013-11-05 | 2019-05-07 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
US9594433B2 (en) | 2013-11-05 | 2017-03-14 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
US10678322B2 (en) | 2013-11-18 | 2020-06-09 | At&T Intellectual Property I, L.P. | Pressure sensing via bone conduction |
US9349280B2 (en) | 2013-11-18 | 2016-05-24 | At&T Intellectual Property I, L.P. | Disrupting bone conduction signals |
US9997060B2 (en) | 2013-11-18 | 2018-06-12 | At&T Intellectual Property I, L.P. | Disrupting bone conduction signals |
US10497253B2 (en) | 2013-11-18 | 2019-12-03 | At&T Intellectual Property I, L.P. | Disrupting bone conduction signals |
US10964204B2 (en) | 2013-11-18 | 2021-03-30 | At&T Intellectual Property I, L.P. | Disrupting bone conduction signals |
US9972145B2 (en) | 2013-11-19 | 2018-05-15 | At&T Intellectual Property I, L.P. | Authenticating a user on behalf of another user based upon a unique body signature determined through bone conduction signals |
US9715774B2 (en) | 2013-11-19 | 2017-07-25 | At&T Intellectual Property I, L.P. | Authenticating a user on behalf of another user based upon a unique body signature determined through bone conduction signals |
US9405892B2 (en) | 2013-11-26 | 2016-08-02 | At&T Intellectual Property I, L.P. | Preventing spoofing attacks for bone conduction applications |
US9736180B2 (en) | 2013-11-26 | 2017-08-15 | At&T Intellectual Property I, L.P. | Preventing spoofing attacks for bone conduction applications |
US9547388B2 (en) | 2014-08-01 | 2017-01-17 | Hideep Inc. | Touch input device |
US12086370B2 (en) | 2014-08-01 | 2024-09-10 | Hideep Inc. | Touch input device |
US10133377B2 (en) | 2014-08-01 | 2018-11-20 | Hideep Inc. | Smartphone |
US11709573B2 (en) | 2014-08-01 | 2023-07-25 | Hideep Inc. | Touch input device |
US11301103B2 (en) | 2014-08-01 | 2022-04-12 | Hideep Inc. | Touch input device |
US10474271B2 (en) | 2014-08-01 | 2019-11-12 | Hideep Inc. | Touch input device |
US10007371B2 (en) | 2014-08-01 | 2018-06-26 | Hideep Inc. | Smartphone |
US10983648B2 (en) | 2014-08-01 | 2021-04-20 | Hideep Inc. | Touch input device |
US9454253B2 (en) * | 2014-08-01 | 2016-09-27 | Hideep Inc. | Smartphone |
US10276003B2 (en) | 2014-09-10 | 2019-04-30 | At&T Intellectual Property I, L.P. | Bone conduction tags |
US11096622B2 (en) | 2014-09-10 | 2021-08-24 | At&T Intellectual Property I, L.P. | Measuring muscle exertion using bone conduction |
US9582071B2 (en) | 2014-09-10 | 2017-02-28 | At&T Intellectual Property I, L.P. | Device hold determination using bone conduction |
US10045732B2 (en) | 2014-09-10 | 2018-08-14 | At&T Intellectual Property I, L.P. | Measuring muscle exertion using bone conduction |
US9589482B2 (en) | 2014-09-10 | 2017-03-07 | At&T Intellectual Property I, L.P. | Bone conduction tags |
US9882992B2 (en) | 2014-09-10 | 2018-01-30 | At&T Intellectual Property I, L.P. | Data session handoff using bone conduction |
US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US10452185B2 (en) | 2014-09-19 | 2019-10-22 | Hideep Inc. | Smartphone |
US9575586B2 (en) | 2014-09-19 | 2017-02-21 | Hideep Inc. | Touch input device |
US9535529B2 (en) | 2014-09-19 | 2017-01-03 | Hideep Inc. | Smartphone |
US9578148B2 (en) | 2014-09-19 | 2017-02-21 | Hideep Inc. | Smartphone capable of detecting touch position and pressure |
US9658712B2 (en) | 2014-09-19 | 2017-05-23 | Hideep Inc. | Smartphone |
US9619068B2 (en) | 2014-09-19 | 2017-04-11 | Hideep Inc. | Smartphone |
US11182000B2 (en) | 2014-09-19 | 2021-11-23 | Hideep Inc. | Smartphone |
US9804703B2 (en) | 2014-09-19 | 2017-10-31 | Hideep Inc. | Touch input device which detects a magnitude of a touch pressure |
US20160085324A1 (en) * | 2014-09-24 | 2016-03-24 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US11029785B2 (en) | 2014-09-24 | 2021-06-08 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
CN107077242A (en) * | 2014-09-24 | 2017-08-18 | 齐科斯欧公司 | Method for Improving the Accuracy of Touch Screen Event Analysis by Using Spatiotemporal Touch Patterns |
US10606417B2 (en) * | 2014-09-24 | 2020-03-31 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
WO2016058342A1 (en) * | 2014-10-15 | 2016-04-21 | 京东方科技集团股份有限公司 | Display device and drive method therefor |
US10082900B2 (en) | 2014-10-15 | 2018-09-25 | Boe Technology Group Co., Ltd. | Display device and method for driving the same |
US9600079B2 (en) | 2014-10-15 | 2017-03-21 | At&T Intellectual Property I, L.P. | Surface determination via bone conduction |
US9486027B2 (en) | 2014-10-17 | 2016-11-08 | Guardhat, Inc. | Connection assembly for adjoining a peripheral with a host wearable device |
WO2016130951A1 (en) * | 2015-02-12 | 2016-08-18 | Qualcomm Technologies, Inc. | Integrated touch and force detection |
WO2016138535A1 (en) * | 2015-02-27 | 2016-09-01 | Tactual Labs Co. | Multitouch frame matching with distance fields |
US10606402B2 (en) | 2015-07-27 | 2020-03-31 | Hideep Inc. | Smartphone |
US11003006B2 (en) | 2015-07-27 | 2021-05-11 | Hideep Inc. | Touch input device |
US10234984B2 (en) | 2015-07-27 | 2019-03-19 | Hideep Inc. | Backlight module with integrated pressure sensor |
US9501195B1 (en) | 2015-07-27 | 2016-11-22 | Hideep Inc. | Smartphone |
CN105117055A (en) * | 2015-08-14 | 2015-12-02 | 宸鸿科技(厦门)有限公司 | Touch pressed type three-dimensional signal input device, application method and multi-functional touch control panel |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
EP3136210A1 (en) * | 2015-08-31 | 2017-03-01 | HiDeep Inc. | Pressure detector capable of adjusting pressure sensitivity and touch input device including the same |
CN106484176A (en) * | 2015-08-31 | 2017-03-08 | 希迪普公司 | Pressure detector and the touch input device containing which of pressure-sensitivity can be adjusted |
CN106980423A (en) * | 2016-01-19 | 2017-07-25 | 瑞鼎科技股份有限公司 | Self-capacitance touch and pressure sensing device and self-capacitance touch and pressure sensing method |
DE102016125229B4 (en) | 2016-03-02 | 2023-03-23 | Google LLC (n.d.Ges.d. Staates Delaware) | Force measurement with capacitive touch surfaces |
US11635839B2 (en) | 2016-03-25 | 2023-04-25 | Sensel Inc. | System and method for detecting and characterizing force inputs on a surface |
JP2019515372A (en) * | 2016-03-31 | 2019-06-06 | シナプティクス インコーポレイテッド | Combination of transformer capacity data and absolute capacity data for touch force estimation |
JP7112961B2 (en) | 2016-03-31 | 2022-08-04 | シナプティクス インコーポレイテッド | Combining Transformer Capacitance Data and Absolute Capacitance Data for Touch Force Estimation |
US10488996B2 (en) | 2016-05-18 | 2019-11-26 | Sensel, Inc. | System for detecting and confirming a touch input |
WO2017201338A1 (en) * | 2016-05-18 | 2017-11-23 | Sensel Inc. | Method for detecting and confirming a touch input |
JP2019517076A (en) * | 2016-05-18 | 2019-06-20 | センセル インコーポレイテッドSensel,Inc. | How to detect and verify touch input |
US10444921B2 (en) * | 2016-10-21 | 2019-10-15 | Salt International Corp. | Capacitive sensing device and detection method for an irregular conductive matter in a touch event |
US11561636B2 (en) * | 2016-11-24 | 2023-01-24 | Hideep Inc. | Touch input device for detecting pressure with display noise compensation |
US20190087035A1 (en) * | 2017-09-15 | 2019-03-21 | Stmicroelectronics Asia Pacific Pte Ltd | Partial mutual capacitive touch sensing in a touch sensitive device |
US10996792B2 (en) * | 2017-09-15 | 2021-05-04 | Stmicroelectronics Asia Pacific Pte Ltd | Partial mutual capacitive touch sensing in a touch sensitive device |
KR20180009369A (en) * | 2018-01-10 | 2018-01-26 | 주식회사 하이딥 | Pressure detector capable of pressure sensitivity adjustment and touch input depvice including the same |
KR101939196B1 (en) * | 2018-01-10 | 2019-01-17 | 주식회사 하이딥 | Pressure detector capable of pressure sensitivity adjustment and touch input depvice including the same |
US10831316B2 (en) | 2018-07-26 | 2020-11-10 | At&T Intellectual Property I, L.P. | Surface interface |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
TWI735117B (en) * | 2018-12-27 | 2021-08-01 | 聯詠科技股份有限公司 | Electronic device and fingerprint sensing control method thereof |
US11321958B2 (en) | 2018-12-27 | 2022-05-03 | Novatek Microelectronics Corp. | Electronic device and fingerprint sensing control method thereof |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11543922B2 (en) | 2019-06-28 | 2023-01-03 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
WO2021005327A1 (en) * | 2019-07-09 | 2021-01-14 | Cambridge Touch Technologies Ltd | Force signal processing |
US11249575B2 (en) | 2019-07-09 | 2022-02-15 | Cambridge Touch Technologies Ltd. | Force signal processing |
US11693504B2 (en) | 2019-07-09 | 2023-07-04 | Cambridge Touch Technologies Ltd. | Force signal processing |
WO2021076123A1 (en) * | 2019-10-16 | 2021-04-22 | Google Llc | Capacitive sensor latency compensation |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
US12163923B2 (en) | 2020-01-29 | 2024-12-10 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
CN111366273A (en) * | 2020-03-04 | 2020-07-03 | 中国科学院苏州纳米技术与纳米仿生研究所 | Attachable vertical microcapacitive flexible mechanical sensor and its manufacturing method and application |
WO2023219259A1 (en) * | 2022-05-12 | 2023-11-16 | 엘지전자 주식회사 | Image display device and operating method therefor |
Also Published As
Publication number | Publication date |
---|---|
US9430107B2 (en) | 2016-08-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9430107B2 (en) | Determining touch locations and forces thereto on a touch and force sensing surface | |
US9207820B2 (en) | Method and system for multi-touch decoding | |
EP3132330B1 (en) | Determining touch locations and forces thereto on a touch and force sensing surface | |
US9904417B2 (en) | Projected capacitive touch detection with touch force detection using self-capacitance and mutual capacitance detection | |
KR101453347B1 (en) | Touch detecting method and apparatus for decreasing noise | |
TWI614647B (en) | Force sensing x-y touch sensor | |
US8659557B2 (en) | Touch finding method and apparatus | |
TWI496041B (en) | Two-dimensional touch sensors | |
EP2159673A2 (en) | A multi-point touch-sensitive system | |
US20150103043A1 (en) | Hover Position Calculation in a Touchscreen Device | |
EP2159672A2 (en) | Method of operating a multi-point touch-sensitive system | |
US20160005352A1 (en) | Touch sensing device | |
CN102135829A (en) | System and method for driving touch panel | |
US20100141604A1 (en) | Resistive multi touch screen | |
US20100328233A1 (en) | Touch panel with unbalanced conductive patterns, and touch-controlled apparatus and method for determining multi-touch thereof | |
CN109669585B (en) | Capacitive touch sensing that can determine conductivity type | |
US20120056842A1 (en) | Sensing Apparatus for Touch Panel and Sensing Method Thereof | |
US9507454B1 (en) | Enhanced linearity of gestures on a touch-sensitive surface | |
US11243636B1 (en) | Rollable display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HANAUER, JERRY;LAMONT, LANCE;CURTIS, KEITH E.;SIGNING DATES FROM 20131205 TO 20140123;REEL/FRAME:032718/0100 |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNOR:MICROCHIP TECHNOLOGY INCORPORATED;REEL/FRAME:041675/0617 Effective date: 20170208 Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT Free format text: SECURITY INTEREST;ASSIGNOR:MICROCHIP TECHNOLOGY INCORPORATED;REEL/FRAME:041675/0617 Effective date: 20170208 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNORS:MICROCHIP TECHNOLOGY INCORPORATED;SILICON STORAGE TECHNOLOGY, INC.;ATMEL CORPORATION;AND OTHERS;REEL/FRAME:046426/0001 Effective date: 20180529 Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT Free format text: SECURITY INTEREST;ASSIGNORS:MICROCHIP TECHNOLOGY INCORPORATED;SILICON STORAGE TECHNOLOGY, INC.;ATMEL CORPORATION;AND OTHERS;REEL/FRAME:046426/0001 Effective date: 20180529 |
|
AS | Assignment |
Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT, CALIFORNIA Free format text: SECURITY INTEREST;ASSIGNORS:MICROCHIP TECHNOLOGY INCORPORATED;SILICON STORAGE TECHNOLOGY, INC.;ATMEL CORPORATION;AND OTHERS;REEL/FRAME:047103/0206 Effective date: 20180914 Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES C Free format text: SECURITY INTEREST;ASSIGNORS:MICROCHIP TECHNOLOGY INCORPORATED;SILICON STORAGE TECHNOLOGY, INC.;ATMEL CORPORATION;AND OTHERS;REEL/FRAME:047103/0206 Effective date: 20180914 |
|
AS | Assignment |
Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: RELEASE OF SECURITY INTEREST IN CERTAIN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:047976/0884 Effective date: 20181221 Owner name: ATMEL CORPORATION, ARIZONA Free format text: RELEASE OF SECURITY INTEREST IN CERTAIN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:047976/0884 Effective date: 20181221 Owner name: ATMEL CORPORATION, ARIZONA Free format text: RELEASE OF SECURITY INTEREST IN CERTAIN PATENT RIGHTS;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:047976/0937 Effective date: 20181221 Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: RELEASE OF SECURITY INTEREST IN CERTAIN PATENT RIGHTS;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:047976/0937 Effective date: 20181221 |
|
AS | Assignment |
Owner name: NEODRON LIMITED, IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MICROCHIP TECHNOLOGY INC.;ATMEL CORPORATION;MICROCHIP TECHNOLOGY GERMANY GMBH;REEL/FRAME:048259/0840 Effective date: 20181221 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: MICROSEMI STORAGE SOLUTIONS, INC., ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059333/0222 Effective date: 20220218 Owner name: MICROSEMI CORPORATION, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059333/0222 Effective date: 20220218 Owner name: ATMEL CORPORATION, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059333/0222 Effective date: 20220218 Owner name: SILICON STORAGE TECHNOLOGY, INC., ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059333/0222 Effective date: 20220218 Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059333/0222 Effective date: 20220218 |
|
AS | Assignment |
Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:059666/0545 Effective date: 20220218 |
|
AS | Assignment |
Owner name: MICROSEMI STORAGE SOLUTIONS, INC., ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:059358/0001 Effective date: 20220228 Owner name: MICROSEMI CORPORATION, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:059358/0001 Effective date: 20220228 Owner name: ATMEL CORPORATION, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:059358/0001 Effective date: 20220228 Owner name: SILICON STORAGE TECHNOLOGY, INC., ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:059358/0001 Effective date: 20220228 Owner name: MICROCHIP TECHNOLOGY INCORPORATED, ARIZONA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:059358/0001 Effective date: 20220228 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240830 |