
Touchscreen Support in Qualcomm Brew


Although the Apple iPhone was not the first touchscreen cell phone on the market (think back to Windows Mobile and Treo devices), it’s certainly caused quite a fuss for carriers and manufacturers, all of whom have been in quite a rush to launch their own touchscreen devices. Many of these devices run Qualcomm Brew, and it’s perhaps not surprising that Brew has had support for touchscreens for a long time—since BREW 2.x, in fact. In this article, I show you how to receive touchscreen events so you can add touchscreen support to your own applications. Everything I describe works in the simulator, so you can do your development normally, test in the simulator, and as you receive hardware for testing, be ready to deploy your application.

The IControl Interface and Touchscreen Events

The good news is that if you’re one of the relatively few developers who are happy using Brew’s IControl interface hierarchy, you get touchscreen support essentially for free. Text controls and menus accept touch input, so all you need to do is handle control selection by touch.

To make sure your IControl-based application is ready for touch support, you need to do two things:

  1. Ensure your application passes touch events to the controls.
  2. Ensure that when the user touches a different control, you transfer focus to the touched control.

The first is relatively easy: Just be sure you pass all events to the underlying active control using ICONTROL_HandleEvent. Generally, it’s best to do this at the beginning of your event handler, like this:

static boolean HandleEvent( CApp *pMe, AEEEvent ec, uint16 w,
                            uint32 dw ) {
   boolean result = FALSE;

   // Give the active control first crack at every event,
   // including the pen events.
   if ( pMe->pIActiveCtl ) {
      result = ICONTROL_HandleEvent( pMe->pIActiveCtl, ec, w, dw );
   }

   // Handle only the events the control didn't consume.
   if ( !result ) switch( ec ) {
      // Handle event cases here.
   }
   return result;
}

The second is easy, too, as long as you can do hit testing to determine where the user touched the screen. Once you do this—the subject of the next section—you should defocus the current control and give focus to the new control, doing something like this:

// pINewFocusedCtl is the control your hit test found.
if ( pMe->pIActiveCtl ) {
   ICONTROL_SetActive( pMe->pIActiveCtl, FALSE );
}
ICONTROL_SetActive( pINewFocusedCtl, TRUE );
pMe->pIActiveCtl = pINewFocusedCtl;

Accepting Touchscreen Events

Under the hood, BREW defines the following events for touchscreens:

  • EVT_PEN_DOWN when the user touches the display
  • EVT_PEN_MOVE as the user moves across the display while touching it
  • EVT_PEN_UP when the user stops touching the display
  • EVT_PEN_STALE_MOVE when the user has moved while touching the display and newer EVT_PEN_MOVE events are already in the queue

Thus, when the user touches the display, the general sequence of events is either EVT_PEN_DOWN followed by EVT_PEN_UP (for quick taps), or EVT_PEN_DOWN, a series of EVT_PEN_MOVE events (perhaps with one or more EVT_PEN_STALE_MOVE events), and finally EVT_PEN_UP. The x- and y-coordinates of the event are delivered to you as a pair of integers packed into the uint32 that accompanies the event; you can extract just the x- or y-coordinate from the packed value using the BREW helpers AEE_GET_X and AEE_GET_Y.
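For instance, inside the switch of the event handler shown earlier, unpacking the coordinates might look like the following sketch. The penDownX and penDownY fields are hypothetical additions to the application structure, used here only to remember where the touch began:

case EVT_PEN_DOWN: {
   // Unpack the packed coordinates using the BREW helpers.
   int16 x = (int16)AEE_GET_X( dw );
   int16 y = (int16)AEE_GET_Y( dw );

   // penDownX/penDownY are assumed fields on CApp that record
   // where the touch began, for use in later hit testing.
   pMe->penDownX = x;
   pMe->penDownY = y;
   result = TRUE;
   break;
}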

Consequently, hit-testing—detecting which item is touched—in your event handler is easy, especially if you keep a list of objects that can be touched by the user. This is especially important if your application is a game or another application that doesn’t use the BREW IControl hierarchy of controls. The basic algorithm is:

// On EVT_PEN_UP: walk the list of touchable objects and find the hit.
// TouchObject, pMe->objects, and pMe->nObjects are illustrative names
// for whatever list your application keeps.
int x0 = AEE_GET_X( dw );
int y0 = AEE_GET_Y( dw );
int i;

for ( i = 0; i < pMe->nObjects; i++ ) {
   TouchObject *p = &pMe->objects[ i ];
   if ( x0 >= p->rc.x && x0 < p->rc.x + p->rc.dx &&
        y0 >= p->rc.y && y0 < p->rc.y + p->rc.dy ) {
      // The current object is the one hit by the touch event;
      // focus it, activate it, or what have you.
      break;
   }
}

You want to do the hit testing on the EVT_PEN_UP event, not the EVT_PEN_DOWN event, because it gives the user a chance to cancel a selection: touch an item, move away from it while still touching the screen, and then release. This is analogous to how buttons work in a desktop user interface: Try clicking down on the OK button of a dialog and then dragging the mouse away from the button; when you release the mouse, the OK action is taken only if the pointer is over the button.
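One way to get this behavior is to record the touched object on EVT_PEN_DOWN and commit only on EVT_PEN_UP. The following is a sketch under a couple of assumptions: HitTest is a hypothetical helper wrapping the loop above, and pPendingObj is an assumed field on the application structure:

case EVT_PEN_DOWN:
   // HitTest is a hypothetical helper that returns the touched
   // object, or NULL if nothing was hit; pPendingObj is an assumed
   // field remembering what the pen went down on.
   pMe->pPendingObj = HitTest( pMe, AEE_GET_X( dw ), AEE_GET_Y( dw ) );
   result = TRUE;
   break;

case EVT_PEN_UP:
   // Commit only if the pen is released over the same object it
   // went down on; releasing anywhere else cancels the selection.
   if ( pMe->pPendingObj &&
        pMe->pPendingObj == HitTest( pMe, AEE_GET_X( dw ),
                                     AEE_GET_Y( dw ) ) ) {
      // Focus or activate pMe->pPendingObj here.
   }
   pMe->pPendingObj = NULL;
   result = TRUE;
   break;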

Handling EVT_PEN_MOVE is a different story: if you want to implement dragging, you must be prepared to do whatever additional work each EVT_PEN_MOVE event requires. Keep that work as brief as possible, because your application can receive a great many EVT_PEN_MOVE events.
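A minimal dragging sketch might look like this; pDraggedObj is an assumed field pointing at the object being dragged, and IDISPLAY_Update defers the actual screen repaint until control returns to the event loop rather than redrawing on every event:

case EVT_PEN_MOVE:
   // Keep this path cheap; EVT_PEN_MOVE events arrive rapidly.
   if ( pMe->pDraggedObj ) {
      // Move the dragged object (an assumed field) to the
      // current pen position.
      pMe->pDraggedObj->rc.x = (int16)AEE_GET_X( dw );
      pMe->pDraggedObj->rc.y = (int16)AEE_GET_Y( dw );
      // IDISPLAY_Update defers the repaint until the event
      // loop regains control, batching rapid moves.
      IDISPLAY_Update( pMe->pIDisplay );
   }
   result = TRUE;
   break;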

A Bit of Advice…

When implementing touch support, you should remember that touch positions are approximate. This doesn’t so much reflect inaccuracies in the hardware as it reflects the inability of a user to precisely position their fingertip or a pen tip on exactly one spot of the screen. The user may be in motion (walking, riding a bus, or heaven forbid, driving), and both pointer and device will be in lateral motion, effectively adding noise to the input your application is trying to obtain.

To work with this, make your hit areas large. Given the small dot pitch of today's screens, it's nearly impossible to hit a 16×16 rectangle when that's only two-tenths of an inch on a side! Size your controls accordingly: a single line of text may not be tall enough to serve as a control on its own, so use the extra screen real estate these devices provide to give your users big, fat touchable areas. Some carriers mandate a minimum hit size for specific devices (probably around forty pixels on a side, but you should check with your carrier partners!), and you may fail carrier certification if you don't meet those requirements. This is especially true for devices competing directly with the iPhone, where users are expected to touch the display with a finger, not a stylus.
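As a sketch, you might enforce a minimum hit size by inflating each object's hit rectangle before testing it. MIN_HIT_SIZE here is an assumed value, not any particular carrier's requirement:

// Inflate a hit rectangle to a minimum size, keeping it centered.
// MIN_HIT_SIZE is an assumption for illustration; check your
// carrier's actual requirements.
#define MIN_HIT_SIZE 40

static void InflateHitRect( AEERect *prc ) {
   if ( prc->dx < MIN_HIT_SIZE ) {
      prc->x -= ( MIN_HIT_SIZE - prc->dx ) / 2;
      prc->dx = MIN_HIT_SIZE;
   }
   if ( prc->dy < MIN_HIT_SIZE ) {
      prc->y -= ( MIN_HIT_SIZE - prc->dy ) / 2;
      prc->dy = MIN_HIT_SIZE;
   }
}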

Conclusion

Adding touch support to your application isn’t hard, especially if you’re already leveraging Brew’s IControl hierarchy of interfaces. Just be sure to send the pen events to the active control, and add hit testing to your application to determine which control should be activated. If you’ve created your own controls, it’s a little more work: Be prepared to handle the trio of EVT_PEN_DOWN, EVT_PEN_MOVE, and EVT_PEN_UP events; most of your work is likely in handling EVT_PEN_UP, because that’s what confirms what your user wants to do. Regardless, it’s not difficult to be ready for the wave of touchscreen phones coming to market!

About the Author

Ray Rischpater is the chief architect at Rocket Mobile, Inc., specializing in the design and development of messaging and information access applications for today's wireless devices. He is the author of several books on software development, including eBay Application Development and Software Development for the QUALCOMM BREW Platform, both available from Apress, and is an active Amateur Radio operator. Contact Ray at kf6gpe@lothlorien.com.
