Native iOS Touch Events w/ Rust

Wade Zimmerman · Jan 22 '23 · Dev Community

A continuation of my journey exploring cross-platform game development using purely Rust and the Bevy engine.

Seriously. It's 100% Rust!

Previous Article: Compiling to iOS


Criteria

Once my project compiles to iOS, the next logical step is to handle tapping and clicking. If all goes well, the engine should produce a coordinate value on a tap/click event. Ideally, game components should react to the event and produce some visual feedback.


0. Create a grid of squares

This part of the program is not important right now because my main goal is to recognize input events. If you are following along, you can either skip this step or make whatever you want.

For those who are curious, the core of the grid comes down to a sprite bundle, which I encapsulated as "TileBundle". For now, each tile has an arbitrary color and position.

// fields of my TileBundle (other fields elided)
sprite: SpriteBundle {
    sprite: Sprite {
        color: settings.color,
        ..default()
    },
    ..default()
},
position: settings.position,
..default()
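For anyone who does want to follow along, the grid layout itself is just arithmetic. Here is a plain-Rust sketch (a hypothetical helper, not the article's actual code) that computes tile center positions for a grid centered on the origin:

```rust
// Hypothetical helper: lay out a cols x rows grid of tile centers,
// spaced `size` units apart, centered on the origin.
// Returns (x, y) world positions, row by row.
fn grid_positions(cols: u32, rows: u32, size: f32) -> Vec<(f32, f32)> {
    let mut positions = Vec::new();
    // offsets that shift the whole grid so it is centered on (0, 0)
    let x_offset = (cols as f32 - 1.0) * size / 2.0;
    let y_offset = (rows as f32 - 1.0) * size / 2.0;
    for row in 0..rows {
        for col in 0..cols {
            positions.push((
                col as f32 * size - x_offset,
                row as f32 * size - y_offset,
            ));
        }
    }
    positions
}
```

Each position would then be handed to a `TileBundle` as its `settings.position`.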

Rust Native iOS Grid


1. Make a System for Handling Pointer Events

Depending on the scenario, I may want to separate click events from tap events, but for now, I want the mobile and desktop environments to behave the same. So I will dispatch the same MyPointerEvent for both events.

However, knowing this could be a gross simplification, I made two separate systems. If I decide to make the event handling more complex at a later point, all I have to do is expand the definition of the struct to fit my needs.

.add_system(tap_capture_system)
.add_system(click_capture_system)
.add_event::<MyPointerEvent>()
pub struct MyPointerEvent {
    pub position: Vec2,
}

2. Obtain a window coordinate

I want my pointer events to hold a coordinate value. To obtain a coordinate I need to use the primary window or the window associated with a specific camera.

fn click_capture_system(
    windows: Res<Windows>,
    mut tap_event: EventWriter<MyPointerEvent>,
    q_camera: Query<(&Camera, &GlobalTransform), With<MainCamera>>,
    // todo: add mouse button or tap
) {

    // assuming there is exactly one main camera entity, so query::single() is OK
    let (camera, camera_transform) = q_camera.single();

    // get the window that the camera is displaying to (or the primary window)
    let wnd = if let RenderTarget::Window(id) = camera.target {
        windows.get(id).unwrap()
    } else {
        windows.get_primary().unwrap()
    };
}

3. Dispatch shared event

Now I need to listen for click events or pointer events on the window. When an event happens in either scenario, I will dispatch the same event.

fn click_capture_system(
    // ...
    btn: Res<Input<MouseButton>>,
) {
    // check if the cursor is inside the window and get its position
    if let Some(screen_pos) = wnd.cursor_position() {
        if btn.just_released(MouseButton::Left) {
            debug!("hello click {}", screen_pos);
            tap_event.send(MyPointerEvent {
                position: screen_pos
            });
        }
    }
}

Touches are handled slightly differently because the position is not optional. I'm pretty sure the difference comes down to the fact that mobile devices can only detect taps while the application is open. That's a whole can of worms I don't really care about right now.

fn tap_capture_system(
    // ...
    touches: Res<Touches>,
) {
    // iter_just_released() already filters, so no extra check is needed
    for touch in touches.iter_just_released() {
        debug!("hello tap {}", touch.position());
        tap_event.send(MyPointerEvent {
            position: touch.position()
        });
    }
}

Rust native tap event logged to console


4. Listening to custom event

Now I want to consume my custom event data elsewhere in my application. So for now, I created a separate system for handling pointer events, which lives within the plugin I use for my TileBundle code.

fn handle_tile_pointer_events(
    mut events: EventReader<MyPointerEvent>,
) {
    for pointer_event in events.iter() {
       // do something
    }
}

5. Convert Screen Coordinates into Game World Coordinates

This took me a while to conceptualize the first time, so give yourself some time to understand what is going on.

There is a problem with my existing code: there are multiple coordinate systems to account for. The window (or mobile device) has a 2D coordinate system, and the game world has its own 2D/3D coordinate system.

NDC Ray Casting Rust

I need to convert the operating system's coordinate system into a coordinate that makes sense for my game. This is done by converting the screen position to normalized device coordinates and un-projecting it, which is essentially ray casting. I like to think of it as mapping the game world onto a flat surface.

Here is the code I'm using for the mouse system and the tap system. I replaced the raw touch/click position in previous steps with the world position produced below.

// get the size of the window
let window_size = Vec2::new(wnd.width() as f32, wnd.height() as f32);

// convert screen position [0..resolution] to ndc [-1..1] (gpu coordinates)
let ndc = (INSERT_SCREEN_POSITION_HERE / window_size) * 2.0 - Vec2::ONE;

// matrix for undoing the projection and camera transform
let ndc_to_world = camera_transform.compute_matrix() * camera.projection_matrix().inverse();

// use it to convert ndc to world-space coordinates
let world_pos = ndc_to_world.project_point3(ndc.extend(-1.0));

// reduce it to a 2D value
let world_pos: Vec2 = world_pos.truncate();
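The math is easier to trust once you can check it by hand. For the special case of a default 2D orthographic camera sitting at the origin, the matrix inverse above collapses to simple scaling, so the whole conversion can be sketched in plain Rust (no Bevy types; `screen_to_world` is a hypothetical helper, not Bevy API):

```rust
// Screen-to-world conversion for an orthographic 2D camera at the origin.
// In this special case the inverse projection is just a scale:
// ndc in [-1, 1] maps back to [-half_extent, half_extent] in world space.
fn screen_to_world(screen: (f32, f32), window: (f32, f32)) -> (f32, f32) {
    // convert screen position [0..resolution] to ndc [-1..1]
    let ndc = (
        (screen.0 / window.0) * 2.0 - 1.0,
        (screen.1 / window.1) * 2.0 - 1.0,
    );
    // undo the orthographic projection: scale ndc by the window half-extents
    (ndc.0 * window.0 / 2.0, ndc.1 * window.1 / 2.0)
}
```

With an 800x600 window, the screen center (400, 300) lands at world (0, 0) and the corners land at (±400, ±300), which matches what the full matrix version produces for an unmoved camera.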

6. Modify the world position for iOS.

Not sure if this is something that will change in the future, but currently Bevy produces an upside-down coordinate here. To fix this problem, I flipped the world position above on the Y axis.

let world_pos: Vec2 = world_pos.truncate()
    // flip y axis so touches line up with screen
    * Vec2::new(1.0, -1.0);

7. Consume reusable pointer event

I used the event to change the tapped/clicked tile to a random color.

I had to manually import nalgebra and parry2d to make this work.

For now, I calculate the box collider on each event. Ideally these coordinates would be held by the tile bundle. All I'm doing is seeing if the click/tap overlaps with a tile sprite.

for e in events.iter() {
    for (mut tile, mut sprite, global, transform) in q.iter_mut() {
        tile.update();

        let pointer = point!(e.position.x, e.position.y);

        let pos = global.translation();

        let size = transform.scale;
        let bl = pos - (size / 2.0);
        let tr = pos + (size / 2.0);

        let square = [
            point!(bl.x, tr.y),
            point!(tr.x, tr.y),
            point!(tr.x, bl.y),
            point!(bl.x, bl.y),
        ];

        if point_in_poly2d(&pointer, &square) {
            sprite.color = random_color();
        }
    }
}
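Since the tiles are axis-aligned rectangles, the parry2d polygon test above could also be replaced by a plain AABB check with no extra dependencies. A sketch (a hypothetical helper, not the code this article uses):

```rust
// Does `point` fall inside the axis-aligned box centered at `center`
// with the given `size` (full width and height)?
fn point_in_tile(point: (f32, f32), center: (f32, f32), size: (f32, f32)) -> bool {
    let (half_w, half_h) = (size.0 / 2.0, size.1 / 2.0);
    point.0 >= center.0 - half_w
        && point.0 <= center.0 + half_w
        && point.1 >= center.1 - half_h
        && point.1 <= center.1 + half_h
}
```

For a handful of tiles the polygon test is fine either way; the AABB version just avoids pulling in nalgebra and parry2d for this one check.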

Conclusion

It is possible to use Rust to handle tap events in iOS.

Here is what the final program looks like.

Color changing iOS Grid on Tap

To Be Continued

Please consider leaving a like and comment below. It helps me plan the next article.

Plus commenting what you're working on may inspire or help others. Let's make Rust iOS development a thing!
