Haptic Feedback during Doom Gameplay
This is a quick-and-dirty proof of concept that I wrote to give me some haptic feedback when I play FPS games. For this experiment I used Doom Eternal. When you take damage in Doom Eternal, the corners of the screen turn red.
I thought it’d be fun to have some sort of physical feedback when I took damage during gameplay. The concept is simple: use computer vision to watch a corner of the screen, and if the red in that corner exceeds a certain threshold, send … a small electric shock? Something that thumps me in the chest? I used the Pillow, mss, and OpenCV libraries to capture the corner and analyze its color distribution. If the red exceeded a threshold, the script activated my haptic unit - in this case, a Lego Mindstorms EV3. The script sends the EV3 a message, and the brick drives a servo motor to buzz me each time the threshold is exceeded (a sketch of the EV3 side appears after the script). So, when I got shot by demons in Doom, I’d get a little buzz from the motor.
My capture framerate was terrible (<25 FPS), and when scenes are predominantly red it just doesn’t work. To recreate this, set the bbox variable to a corner of the screen. You’ll need to play with that and the full_bbox variable - and possibly the color range (HSV_LOWER) - to get it to work. There are a lot of better ways to do this, but this was something I put together in a few hours just to satisfy my curiosity.
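If you’re hunting for a starting HSV_LOWER value for a different color, one option is to convert a sample of that color to HSV with OpenCV and widen the range from there. A quick sketch, using the hex F50000 red the script targets:

import numpy as np
import cv2

# Hex F50000 is RGB (245, 0, 0), which is BGR (0, 0, 245) in OpenCV's ordering
pixel = np.uint8([[[0, 0, 245]]])
print(cv2.cvtColor(pixel, cv2.COLOR_BGR2HSV)[0][0])  # -> [  0 255 245], i.e. H=0, S=255, V=245

Here’s the full script: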
"""
A PoC that uses CV to analyze Doom Eternal gameplay and provide
haptic feedback to an external device
- Creates a preview window
- Captures a portion of the screen
- Analzyes captured corner for "red"
- If >= threshold, sends message to hardware device
"""
import socket
import numpy as np
import cv2
from mss import mss
from PIL import Image
sct = mss()

bbox = {'top': 155, 'left': 20, 'width': 50, 'height': 50}         # ROI: the corner of the screen to watch
full_bbox = {'top': 155, 'left': 20, 'width': 960, 'height': 540}  # Larger capture used for the preview window

window_name = 'Preview'
MIN_HIT_THRESHOLD = 3000  # How many ROI pixels must fall within the HSV range to count as a hit
FRAME_DELAY = 60  # How many frames to wait before activating the haptic feedback again
HSV_LOWER = np.array([0, 138, 96])  # Lower bound of the HSV range to match (a red near hex F50000)
HSV_UPPER = np.array([255, 255, 255])  # Upper bound of the HSV range
HAPTIC_DEVICE = '192.168.86.22'  # The haptic device's hostname or IP address
HAPTIC_PORT = 5009  # The port used by the device

def analyze_image(image, original):
    """
    Count the ROI pixels that fall within the target HSV range and
    show a masked preview of the full capture
    """
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LOWER, HSV_UPPER)
    num_red_pixels = np.count_nonzero(mask)  # Count matching pixels, not per-channel values

    # Show the masked version of the full capture in the preview window
    original_hsv = cv2.cvtColor(original, cv2.COLOR_BGR2HSV)
    preview_mask = cv2.inRange(original_hsv, HSV_LOWER, HSV_UPPER)
    preview = cv2.bitwise_and(original, original, mask=preview_mask)
    cv2.imshow(window_name, preview)
    cv2.moveWindow(window_name, 980, 10)

    return num_red_pixels >= MIN_HIT_THRESHOLD

def send_haptic_signal():
    """
    Sends a signal to the haptic device for feedback
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((HAPTIC_DEVICE, HAPTIC_PORT))
        s.sendall(b'HIT')
        s.recv(1024)  # Wait for the device to acknowledge

def main():
    frames = 0
    last_haptic_frame = 0
    while True:
        frames += 1
        roi_shot = sct.grab(bbox)
        full_shot = sct.grab(full_bbox)
        roi_img = Image.frombytes('RGB', roi_shot.size, roi_shot.rgb)
        roi_img = cv2.cvtColor(np.array(roi_img), cv2.COLOR_RGB2BGR)
        full_img = Image.frombytes('RGB', full_shot.size, full_shot.rgb)
        full_img = cv2.cvtColor(np.array(full_img), cv2.COLOR_RGB2BGR)
        is_hit = analyze_image(roi_img, full_img)
        # Debounce: only fire the haptics if FRAME_DELAY frames have passed since the last hit
        if is_hit and frames - last_haptic_frame >= FRAME_DELAY:
            last_haptic_frame = frames
            send_haptic_signal()
        if cv2.waitKey(1) & 0xFF == ord('q'):
            cv2.destroyAllWindows()
            break

if __name__ == "__main__":
    main()
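The EV3 side isn’t shown above. For reference, here’s a minimal sketch of what the listener could look like, assuming the brick runs ev3dev with the python-ev3dev2 bindings and the motor is plugged into output port A - the port number and the 'HIT'/acknowledgement messages match the script above, but everything else is illustrative:

# Hypothetical EV3-side listener (assumes ev3dev + python-ev3dev2)
import socket
from ev3dev2.motor import MediumMotor, OUTPUT_A, SpeedPercent

motor = MediumMotor(OUTPUT_A)  # Assumes the buzz motor is on port A

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind(('', 5009))  # HAPTIC_PORT from the capture script
    server.listen(1)
    while True:
        conn, _ = server.accept()
        with conn:
            if conn.recv(1024) == b'HIT':
                motor.on_for_seconds(SpeedPercent(75), 0.2)  # Short buzz
                conn.sendall(b'ACK')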