PRE2024 3 Group15

From Control Systems Technology Group

Group members: Nikola Milanovski, Senn Loverix, Illie Alexandru, Matus Sevcik, Gabriel Karpinsky

Introduction and plan

Problem statement and objectives

Users

Visual/performance artists (DJs), music producers, and VTubers

User requirements

Approach, milestones and deliverables

  • Market research interviews with musicians, music producers, etc.
    • Requirements for hardware
    • Ease-of-use requirements
    • Understanding of how to seamlessly integrate our product into a musician's workflow
  • Find software stack solutions
    • Library for hand tracking
    • Encoder to MIDI or another viable format
    • Synthesizer that can accept live input in the chosen encoding format
    • Audio output solution
  • Find hardware solutions
    • Camera/visual input
      • Multiple cameras
      • IR depth tracking
      • Viability of a standard laptop webcam or other camera
  • MVP (minimum viable product)
    • Create a demonstration product proving the viability of the concept by controlling a single synthesizer using basic hand gestures and a laptop webcam or another easily accessible camera
  • Test with potential users and get feedback
  • Refined final product
    • Additional features
    • Ease of use and integration improvements
    • Testing on different hardware and software platforms
    • Visual improvements to the software
    • Potential support for more encoding formats or additional input methods other than hand tracking
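The core of the software stack above (hand tracking → encoder → MIDI) can be sketched as a single mapping step. The sketch below assumes the hand tracker reports a normalized vertical hand position in the range 0.0–1.0 (the convention of the eventual tracking library may differ) and maps it to a raw 3-byte MIDI Control Change message; the choice of controller 1 (the mod wheel) and channel 0 is illustrative, not a design decision.

```python
def hand_y_to_midi_cc(y_norm: float, controller: int = 1, channel: int = 0) -> bytes:
    """Map a normalized hand height (0.0 = bottom, 1.0 = top) to a
    3-byte MIDI Control Change message.

    The 0.0-1.0 input convention is an assumption about the hand-tracking
    library; controller 1 (mod wheel) and channel 0 are illustrative defaults.
    """
    # Clamp to the valid input range, then scale to MIDI's 7-bit value space.
    y_norm = max(0.0, min(1.0, y_norm))
    value = round(y_norm * 127)
    # Status byte: 0xB0 (Control Change) combined with the 4-bit channel number.
    status = 0xB0 | (channel & 0x0F)
    return bytes([status, controller & 0x7F, value])
```

These raw bytes are what a MIDI output library would send to the synthesizer each frame; rate-limiting and smoothing (to avoid audible stepping) would be the encoder's next concern.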

Who is doing what?

Nikola - Interface with audio software

Senn - Hardware interface

Gabriel, Illie, Matus - Software processing of input and producing output



State of the art

[1]

“A MIDI Controller based on Human Motion Capture (Institute of Visual Computing, Department of Computer Science, Bonn-Rhein-Sieg University of Applied Sciences),” ResearchGate. Accessed: Feb. 12, 2025. [Online]. Available: https://www.researchgate.net/publication/264562371_A_MIDI_Controller_based_on_Human_Motion_Capture_Institute_of_Visual_Computing_Department_of_Computer_Science_Bonn-Rhein-Sieg_University_of_Applied_Sciences

[2]

M. Lim and N. Kotsani, “An Accessible, Browser-Based Gestural Controller for Web Audio, MIDI, and Open Sound Control,” Computer Music Journal, vol. 47, no. 3, pp. 6–18, Sep. 2023, doi: 10.1162/COMJ_a_00693.

[3]

M. Oudah, A. Al-Naji, and J. Chahl, “Hand Gesture Recognition Based on Computer Vision: A Review of Techniques,” J. Imaging, vol. 6, no. 8, p. 73, Jul. 2020, doi: 10.3390/jimaging6080073.

[4]

A. Tagliasacchi, M. Schröder, A. Tkach, S. Bouaziz, M. Botsch, and M. Pauly, “Robust Articulated‐ICP for Real‐Time Hand Tracking,” Computer Graphics Forum, vol. 34, no. 5, pp. 101–114, Aug. 2015, doi: 10.1111/cgf.12700.

[5]

A. Tkach, A. Tagliasacchi, E. Remelli, M. Pauly, and A. Fitzgibbon, “Online generative model personalization for hand tracking,” ACM Transactions on Graphics, vol. 36, no. 6, pp. 1–11, Nov. 2017, doi: 10.1145/3130800.3130830.

[6]

T. Winkler, Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, MA, USA: MIT Press, 2001.

[7]

E. R. Miranda and M. M. Wanderley, New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. Middleton, WI, USA: AR Editions, Inc., 2006.

[8]

D. Hosken, An Introduction to Music Technology, 2nd ed. New York, NY, USA: Routledge, 2014. doi: 10.4324/9780203539149.

[9]

P. D. Lehrman and T. Tully, "What is MIDI?," Medford, MA, USA: MMA, 2017.

[10]

C. Dobrian and F. Bevilacqua, Gestural Control of Music Using the Vicon 8 Motion Capture System. UC Irvine: Integrated Composition, Improvisation, and Technology (ICIT), 2003.

[11]

J. L. Hernandez-Rebollar, “Method and apparatus for translating hand gestures,” US7565295B1, Jul. 21, 2009 Accessed: Feb. 12, 2025. [Online]. Available: https://patents.google.com/patent/US7565295B1/en