qSound is an interactive Python project that translates real-time audio input into visual quantum state representations. Using PyAudio for audio capture, Qiskit for quantum simulations, and OpenGL for rendering, this project provides a unique visualization of sound properties such as amplitude, frequency, and phase as they influence simulated qubit interactions.
qSound captures real-time audio input and decomposes it into essential components such as frequency, amplitude, and phase. These components are then mapped to quantum states, providing a visualization of quantum behaviors influenced by sound. It's aimed at anyone curious about the intersection of sound and quantum mechanics and can double as an educational tool, but it primarily serves as a fun, extremely overcomplicated audiovisual experience.
- Real-Time Audio Processing: Captures live audio input with customizable device selection.
- Quantum Simulation and Visualization: Converts sound parameters into quantum states and displays them in 3D.
- Particle-Based Visualization: Creates a particle system responsive to the audio’s intensity.
- Neural Network Intensity Classification: Evaluates the sound wave's intensity over a time window to choose between different visual flavors.
- OpenGL Rendering: Provides real-time rendering for quantum transformations and particle effects.
- Python 3.8 or above
- PyAudio
- Qiskit
- Librosa
- OpenGL
- Virtual Audio Device (e.g., BlackHole for macOS)
- `audio_processing_pyaudio.py`: Handles audio input using PyAudio, extracting amplitude, frequency, phase, and other features.
- `quantum_process.py`: Defines and applies quantum transformations based on sound attributes. Creates quantum registers and applies state changes to qubits based on sound parameters.
- `finalscript.py`: Initializes and manages threads for audio capture, quantum processing, and OpenGL rendering. Coordinates the components in a main application loop.
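To give a feel for how these components can be coordinated, the following is a minimal sketch of a threaded pipeline. The queue names, worker functions, and dummy payloads are illustrative placeholders, not the actual structure of `finalscript.py`:

```python
import queue
import threading
import time

# Illustrative queues for passing data between the pipeline stages.
audio_features = queue.Queue(maxsize=8)
quantum_states = queue.Queue(maxsize=8)

def audio_worker(stop_event):
    # Stand-in for the PyAudio capture + feature extraction stage.
    while not stop_event.is_set():
        audio_features.put({"amplitude": 0.5, "frequency": 440.0, "phase": 0.0})
        time.sleep(0.05)

def quantum_worker(stop_event):
    # Stand-in for the Qiskit transformation stage.
    while not stop_event.is_set():
        features = audio_features.get()
        quantum_states.put(features)  # a real version would emit a quantum state here

def main():
    stop_event = threading.Event()
    workers = [
        threading.Thread(target=audio_worker, args=(stop_event,), daemon=True),
        threading.Thread(target=quantum_worker, args=(stop_event,), daemon=True),
    ]
    for w in workers:
        w.start()
    try:
        # Stand-in for the OpenGL render loop, which consumes the latest quantum state.
        for _ in range(20):
            print(quantum_states.get())
    finally:
        stop_event.set()

if __name__ == "__main__":
    main()
```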
The `audio_processing_pyaudio.py` script is responsible for capturing live audio input and analyzing various sound features. It uses PyAudio for real-time audio streaming. Key steps in audio processing include:
- Amplitude Calculation:
  - Measures the signal strength by computing the root mean square (RMS) of the audio sample.
  - This value is used to control the strength of quantum transformations in the simulation.
- Frequency Detection:
  - Utilizes a Fast Fourier Transform (FFT) to determine the dominant frequency within the audio sample.
  - The detected frequency is mapped to qubit oscillation rates, influencing their behavior in the visualization.
- Phase Extraction:
  - Determines the phase angle of the dominant frequency component.
  - This phase information is applied as a rotation parameter in the quantum simulation, affecting qubit orientation.
Other parameters, such as the approximate BPM of the sound, further scale the rate at which qubits are rotated on the Bloch sphere.
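The following is a minimal NumPy sketch of these calculations, assuming a mono float32 buffer such as one read from a PyAudio stream; the function name and example values are illustrative, not the exact code in `audio_processing_pyaudio.py`:

```python
import numpy as np

def extract_features(samples: np.ndarray, sample_rate: int = 44100):
    """Compute amplitude (RMS), dominant frequency, and its phase from one audio buffer."""
    # Amplitude: root mean square of the signal.
    amplitude = float(np.sqrt(np.mean(samples ** 2)))

    # Frequency: bin with the largest FFT magnitude (ignoring the DC component).
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak = int(np.argmax(np.abs(spectrum[1:])) + 1)
    frequency = float(freqs[peak])

    # Phase: angle of the dominant frequency component, in radians.
    phase = float(np.angle(spectrum[peak]))

    return amplitude, frequency, phase

# Example with a synthetic 440 Hz tone standing in for a live PyAudio buffer.
sr = 44100
t = np.arange(2048) / sr
print(extract_features(np.sin(2 * np.pi * 440 * t).astype(np.float32), sr))
```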
The `quantum_process.py` module leverages Qiskit to create a dynamic quantum visualization that responds to audio input:
- Quantum State Mapping: Sound features are mapped to quantum transformations as follows (a simplified Qiskit sketch follows this list):
  - Amplitude -> Controls the rotation angle of qubits, dictating the strength of transformations.
  - Frequency -> Determines oscillation rates, influencing the frequency of qubit rotations.
  - Phase -> Sets phase shifts, directly affecting qubit orientation in space.
- Quantum Circuit Construction:
  - Register Initialization: Six groups of quantum registers are initialized in different basis states, including:
    - |+⟩ and |−⟩ states: Created using Hadamard and Z gates.
    - |0⟩ and |1⟩ states: Standard basis states.
    - Y-basis states |i+⟩ and |i−⟩: Generated by combining S gates and Hadamard transformations.
  - Sound-Based Transformations: Each group of qubits is transformed based on its mapped sound parameters, creating diverse behaviors in the visualization.
  - Entanglement: Qubits in different groups are entangled, allowing complex interactions and collective responses to sound input.
- Visualization:
  - The transformed quantum states are rendered as a wave in an OpenGL environment.
  - Particles represent intensity changes, creating an immersive visual experience based on real-time audio features.
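To make the mapping and circuit construction concrete, here is a heavily simplified Qiskit sketch. It uses one qubit per basis group instead of six full registers, and the way amplitude, frequency, and phase are scaled into rotation angles is an assumption for illustration, not the project's exact formula:

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def build_circuit(amplitude, frequency, phase, t):
    """One qubit per basis group; rotations driven by the extracted sound features."""
    qc = QuantumCircuit(6)

    # Register initialization in different bases (one qubit per group for brevity).
    qc.h(0)             # |+>
    qc.h(1); qc.z(1)    # |->
    # qubit 2 stays in |0>
    qc.x(3)             # |1>
    qc.h(4); qc.s(4)    # |i+>  (Y basis)
    qc.h(5); qc.sdg(5)  # |i->  (Y basis)

    # Sound-based transformations (illustrative scaling):
    theta = amplitude * np.pi                   # amplitude -> rotation strength
    omega = 2 * np.pi * frequency / 1000.0 * t  # frequency -> oscillation rate over time t
    for q in range(6):
        qc.ry(theta, q)
        qc.rz(omega + phase, q)                 # phase -> static offset of the rotation

    # Entanglement between neighbouring groups.
    for q in range(5):
        qc.cx(q, q + 1)
    return qc

qc = build_circuit(amplitude=0.4, frequency=440.0, phase=0.3, t=0.1)
print(Statevector.from_instruction(qc))
```

In the real project, the resulting states drive the OpenGL wave and particle system rather than being printed.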