[https://johnnywunder.info/ Smoking on the Porch]

Working on incorporating the [[Schrödinger equation]].

==Links==
Reference Links
* [https://johnnywunder.info/WebContent/ Local WebContent]
* [https://www.w3.org/TR/webaudio/ W3 Web Audio API Recommendation]
* [https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode Mozilla AnalyserNode]
* [https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API Mozilla Web Audio API]
* [https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Introduction/Introduction.html Apple HTML5 Audio Video Introduction]
* [https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/PlayingandSynthesizingSounds/PlayingandSynthesizingSounds.html OLD Apple Playing Sounds]

[[Category:CyberSpells]]
[[Category:Info]]

'''Have fun, be free, be safe, be well.'''

==Guitar Wand==
If there's magic in the world it comes from [https://acousticguitar.com/heres-how-to-restring-your-guitar-with-ease/?utm_source=Acoustic+Guitar&utm_campaign=546c81a268-EMAIL_CAMPAIGN_WEEKLY_04062020_COPY_01&utm_medium=email&utm_term=0_c8a2f76447-546c81a268-365179779&mc_cid=546c81a268&mc_eid=9afd9f145a new] strings on a good guitar.

{{ServiceItems}}

==Set-up==
Setting up the necessary toys.

===Recording===
* [https://webrtc.org/getting-started/media-devices Simple Working getMediaDevices]
* [https://webrtc.github.io/samples/ Media Source Selection]
* [https://pubs.shure.com/view/guide/SM81/en-US.pdf SM81 Mike]

===Tuning Ubuntu Studio===
* [https://help.ubuntu.com/community/UbuntuStudio/UbuntuStudioControls Community Link]
* [https://www.johnnywunder.info/WebContent/manuals/waveform-user-guide-v5.pdf Waveform Users Guide]

===Distribution===
* [https://xiph.org/flac/format.html flac]

==Lightshow==
I'm hesitating more than usual here. Normally I'd be hacking up a program and then adapting it as I go to get what I want. Indeed, that's how I started out creating and analyzing wav, RIFF and flac audio files. In the course of that I've settled on flac for distribution and hope to do a JavaScript screen display directly from a flac file. I do believe that is the right structure and format, but I got stuck there for one pretty solid reason.

Looking at a [https://johnnywunder.info/store/E_TestUntagged.flac simple E string pluck], how would you visualize it? A 7-plus-second, seemingly continuous sound geometrically decreasing in amplitude. Let's simplify and assume it's pure, exclusively 82hz, recorded at 192,000 samples per second. Doing the math, that means from the pluck onward every {{#expr:192000/82}} samples there's a high-amplitude sample that is decreasing. Rounding off, and keeping in mind the pure exclusive pitch, the other 2340 samples are dead quiet.

My initial take is I have 3 parameters to work with on the screen:
# Location
# Brightness
# Color

which I'm leaning toward mapping as follows (a rough code sketch appears at the end of this section):
# Location: streamed to the screen sequentially as pixels, wrapping at screen boundaries
# Brightness: the sample's amplitude, with the 2340 quiet samples mapped to 0 (no change)
# Color: infrared and below, through ROYGBIV, to ultraviolet and above, mapped to the last recognized pitch on a scale running from 0hz, through human hearing's 20hz to 22,000hz, up to 192,000hz

Two things I cannot seem to wrap my head around are:
# The sound, though seemingly continuous and analog, is belied by 2340 holes.
# What does someone or something whose hearing range is broader than mine hear?
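To make that mapping concrete, here is a minimal, hypothetical sketch of one sample-to-pixel pass. The drawSample() and pitchToHue() names, the hue sweep, and the canvas lookup are illustrative assumptions on my part, not code from seeTheMusic.js.

<nowiki>
// Hypothetical sketch: stream samples to pixels (Location), scale by amplitude
// (Brightness), and color by the last recognized pitch (Color).
const SAMPLE_RATE = 192000;                      // samples per second
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

let cursor = 0;                                  // running pixel location
let lastPitchHz = 82;                            // last recognized pitch (low E assumed)

function pitchToHue(hz) {
  // Map 0hz .. 192,000hz (per the scale above; only up to the 96,000hz Nyquist
  // limit can actually occur) onto a 0-300 degree hue sweep standing in for
  // "infrared and below" through ROYGBIV to "ultraviolet and above".
  return 300 * Math.min(hz, SAMPLE_RATE) / SAMPLE_RATE;
}

function drawSample(amplitude) {                 // amplitude expected in -1 .. +1
  const x = cursor % canvas.width;                               // Location: wrap at the right edge
  const y = Math.floor(cursor / canvas.width) % canvas.height;   // ...and at the bottom
  const brightness = Math.abs(amplitude);                        // Brightness: quiet samples = 0, no change
  if (brightness > 0) {
    ctx.fillStyle = `hsl(${pitchToHue(lastPitchHz)}, 100%, ${brightness * 50}%)`;
    ctx.fillRect(x, y, 1, 1);                                    // Color: from the last recognized pitch
  }
  cursor += 1;
}
</nowiki>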
===Tuning===
====Technical====
# Removed the scrollbar as an eyesore:<br/>body { margin: 0; padding: 0; '''overflow: hidden;''' background-color: white; }

====Visuals====
Balancing the 192khz music and the screen refresh cycle is the next step. The screen refresh cycle will vary based on the device running the browser, which I have no control over. Leaning toward defaulting to 60hz, but not sure what that means yet.

Setting up tuning parameters in [https://johnnywunder.info/WebContent/Test/Tuning/select.html select.html]. Currently the only working parameter is the number of circles.

Looking to restructure [https://johnnywunder.info/WebContent/Test/Tuning/seeTheMusic.js seeTheMusic.js] to run everything in a single-pass draw, so every animation frame draws everything.

====Sinusoidal Functions====
Bounce function, from [https://math.libretexts.org/Bookshelves/Calculus/Calculus_(OpenStax)/01%3A_Functions_and_Graphs/1.03%3A_Trigonometric_Functions Libretext Calculus].

Trigonometric formula: f(x) = A sin(B(x − α)) + C

y2 calculation: y2 = freqAmp.MaxAmp · sin((key.width − key.x2Count + freqAmp.MaxFreq)/π) + key.y2Count

Want x = 0 to correspond to −π/2 and x = key.width to correspond to π/2.

[[File:TrigParameters.png|800px]]
# Radian measure is defined such that the angle associated with the arc of length 1 on the unit circle has radian measure π rad.
# For acute angles θ, the values of the trigonometric functions are defined as ratios of two sides of a right triangle in which one of the acute angles is θ.
# For a general angle θ, let (x, y) be a point on a circle of radius r corresponding to this angle θ.
# The trigonometric functions can be written as ratios involving x, y, and r.
# The trigonometric functions are periodic. The sine, cosine, secant, and cosecant functions have period 2π. The tangent and cotangent functions have period π.

=====Sinusoidal Values=====
[[File:TrigFunc.png|800px]]
[[File:TrigFuncDef.png|800px]]
[[File:TrigValues1stQuadrant.png|800px]]
[[File:TangentValues.png|800px]]

===ChatGPT===
Asking [https://chat.openai.com/chat ChatGPT] how to do this:

Here are a few examples of music visualizations that employ Schrödinger's equation and/or Heisenberg's matrix:

Schrödinger's Dream by Joachim Schöberl: This visualization employs Schrödinger's equation to create a colorful, abstract representation of a piece of music. The video can be found on YouTube: https://www.youtube.com/watch?v=-eRYz7PzsH0

The Quantum Symphony by The Private Life of Cows: This visualization employs Heisenberg's matrix to create a visual representation of a musical score, with the probabilities of different notes or chords determining the size and position of various visual elements. The video can be found on Vimeo: https://vimeo.com/245997645

Quantum Music Visualizer by Edwin McLean: This visualization employs both Schrödinger's equation and Heisenberg's matrix to create a 3D representation of a piece of music, with different aspects of the wave function and matrix elements mapped to various visual elements. The video can be found on YouTube: https://www.youtube.com/watch?v=WgGSCp5_5Y0

Keep in mind that these are just a few examples, and there are likely many other music visualizations out there that use quantum mechanics in different ways.

====AI Generated Templates====
=====IOS=====
<nowiki>
import UIKit
import WebKit

class ViewController: UIViewController, WKNavigationDelegate {
    var webView: WKWebView!
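    // Continuation of the AI-generated template: loadView() swaps the controller's
    // root view for a WKWebView, viewDidLoad() points it at the Smoking on the Porch
    // page and enables swipe back/forward navigation, and webView(_:didFinish:)
    // copies the loaded page's title onto the view controller.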
    override func loadView() {
        webView = WKWebView()
        webView.navigationDelegate = self
        view = webView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let url = URL(string: "https://johnnywunder.info/SmokingOnThePorch/index.html")!
        webView.load(URLRequest(url: url))
        webView.allowsBackForwardNavigationGestures = true
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        title = webView.title
    }
}
</nowiki>

=====Android=====
<nowiki>
import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebView;
import android.webkit.WebViewClient;

public class MainActivity extends Activity {
    private WebView mWebView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        mWebView = (WebView) findViewById(R.id.webview);
        mWebView.setWebViewClient(new WebViewClient());
        mWebView.loadUrl("https://johnnywunder.info/SmokingOnThePorch/index.html");
    }

    @Override
    public void onBackPressed() {
        if (mWebView.canGoBack()) {
            mWebView.goBack();
        } else {
            super.onBackPressed();
        }
    }
}
</nowiki>

=====Creating App Bundles=====
As an AI language model, I cannot create the App Bundles for you. However, I can guide you through the process of creating them. Please note that the process may vary depending on your specific development environment and app requirements. It's important to carefully review the documentation for each platform to ensure that you follow the correct steps.

======IOS Bundle======
For iOS, you will need to have an Apple Developer account and follow these general steps:
# In Xcode, select the project in the Project navigator and choose your target under Targets.
# Select "General" from the top of the screen.
# Under "Identity", enter a unique Bundle Identifier. This should be in the format "com.example.myapp".
# Under "Signing", choose your team from the "Team" dropdown menu.
# Choose a development or distribution provisioning profile.
# In the menu bar, choose Product > Archive.
# Choose the archive you just created and select "Distribute App".
# Choose an option for app distribution, such as App Store Connect, and follow the prompts to upload your app.

======Android Bundle======
For Android, you will need to have a Google Developer account and follow these general steps:
# In Android Studio, select "Build" from the top menu and choose "Generate Signed Bundle/APK".
# Choose "Android App Bundle" and click "Next".
# Choose your module and keystore file, and enter your keystore password and alias key password.
# Choose your build type and select "Release" or "Debug" as appropriate.
# Choose your destination folder and click "Finish".
# Upload your App Bundle to the Google Play Console or another distribution platform.

===Base Development===
====Multi-colored Bar Graph First Step====
Well, the hesitation paid off. By using html5 <audio> and the [https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode Web Audio API AnalyserNode], instead of rediscovering and coding [[w:Fourier Transform|Fourier Transforms]] and figuring out how to interpret PCMs at different sampling rates and compression standards, I can concentrate on investigating frequencies and amplitudes from 0hz to {{#expr:192000/2}}hz. Created a [https://johnnywunder.info multi-colored bar graph]. Code at [https://johnnywunder.info/SmokingOnThePorch/seeTheMusic.js seeTheMusic]. At this point the plan is to formalize it, setting up a page with the ability to pick your audio source.
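For reference, the first bar graph boils down to an html5 <audio> element feeding an AnalyserNode whose frequency bins get drawn as bars each animation frame. This is a stripped-down sketch of that idea, not the actual seeTheMusic.js; the element ids, fftSize and bar styling are assumptions of mine.

<nowiki>
// Stripped-down sketch of the <audio> → AnalyserNode → canvas bar graph.
// Element ids ('song', 'visualizer') and styling are illustrative assumptions.
const audioCtx = new AudioContext();               // may need a user gesture to start
const audioEl = document.getElementById('song');   // assumed <audio> element
const source = audioCtx.createMediaElementSource(audioEl);
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;                           // frequencyBinCount = 1024

source.connect(analyser);
analyser.connect(audioCtx.destination);            // keep the sound audible

const canvas = document.getElementById('visualizer');   // assumed <canvas> element
const ctx = canvas.getContext('2d');
const bins = new Uint8Array(analyser.frequencyBinCount);

function draw() {
  requestAnimationFrame(draw);                     // one full redraw per screen refresh
  analyser.getByteFrequencyData(bins);             // 0-255 per frequency bin
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  const barWidth = canvas.width / bins.length;
  bins.forEach((value, i) => {
    const barHeight = (value / 255) * canvas.height;
    ctx.fillStyle = `hsl(${(i / bins.length) * 300}, 100%, 50%)`;   // multi-colored bars
    ctx.fillRect(i * barWidth, canvas.height - barHeight, barWidth, barHeight);
  });
}
draw();
</nowiki>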
I'll maintain the current code at [https://johnnywunder.info/SmokingOnThePorch/seeTheMusic.js seeTheMusic] and set up a visualization page for entertainment and investigations.

Two ways to go now:
# Try to simulate the water light show.
# Do a 'permanent' (for the duration of the song) pixel growth.

To do this I'm looking at existing canvas visualizations, particularly a [https://johnnywunder.info/WebContent/fireworksTutorial/ simple fireworks], and trying to apply that to the frequencies and amplitudes the guitar produces. The key is going to be finding the highest number of samples that works for the [https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode fftSize] and employing the ''magic'' [[w:Octave|doubling of octave]] on an in-tune 440hz A guitar string.

====Step 2 Amplitude====
Set images, color and fade based solely on amplitude. Next step: frequency.

=====Learning=====
Broke the algorithm into three parts: 0 Frequency, 1 Amplitude, 2-4 Circle. Communicating state through the web document.

====Step 3 Frequency====
Using max frequency to control lineWidth and globalAlpha. As part of this incorporation I created [https://johnnywunder.info/SmokingOnThePorch/select.html select], which allows users to select their own song (be sure you have the URL of the song ready to paste). In addition I added automatic resize and full screen for the canvas.

====Step 4 getUserMedia====
Go locally live! Ask the user to grant permission to use whatever sound source their machine produces, from microphones to Spotify to the local library: [https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia getUserMedia()]

===Change Control===
Using git in development with 2 branches, sandbox and main. That works fine. What I'm currently trying to think through is how to preserve old versions that I like of [https://johnnywunder.info/SmokingOnThePorch/seeTheMusic.js seeTheMusic.js] without over-complicating improving and moving forward. Still a work in progress.

==Tools==
===Modularization and Live===
Going to [https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules modularize] the draw routines and create a ''live'' version. Will then port the recorded version over to the modularization.

====Branching to AI for Deeper Communication with Machines====
Choosing flac as my record mechanism of choice has allowed me to store the information I'm interested in without losing any content, and in a format that a machine can read without being tied to play speed. I intend to let a self-learning AI loose on the structured sound data to see if it can learn to collaborate with me.
# Technical/Machine challenges will be [[FLAC Agent Build|here]]
# [[FLAC Agent]] progress toward collaboration will be here.

===Flac===
* [https://xiph.org/flac/ flac]
** [https://xiph.org/flac/documentation_tools_metaflac.html metaflac]

===Development Environment===
* [https://getkt.com/blog/linux-command-line-tools-to-dump-files-in-hex-octal-binary/ Dump File]
** hexdump -C
* Development IDE
** IDE
*** [https://nodejs.org/en/ Node.js]
*** [https://www.npmjs.com/ npm]
**** [https://www.npmjs.com/package/repository npm repository]
*** [https://git-scm.com/ Git] Version Control Repository

===Web Audio API===
====[[Schrödinger equation]]====
Coming along. [https://johnnywunder.info/ Check it out] and play with the accelerator.

=====Determining sound parameters to feed [[Schrödinger equation]]=====
======[https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode/getFloatFrequencyData freqArray]======
The Float32Array that the frequency domain data will be copied to.
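A minimal allocate-and-fill sketch, assuming analyser is an AnalyserNode set up as in the bar-graph sketch earlier:

<nowiki>
// freqArray is sized to frequencyBinCount (half the fftSize); values come back in dB.
const freqArray = new Float32Array(analyser.frequencyBinCount);
analyser.getFloatFrequencyData(freqArray);   // silent bins read as -Infinity
</nowiki>

The behavior notes from the linked MDN page: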
For any sample which is silent, the value is -Infinity. If the array has fewer elements than AnalyserNode.frequencyBinCount, excess elements are dropped. If it has more elements than needed, excess elements are ignored. The indexes of the output can be mapped linearly between zero and the Nyquist frequency, which is defined to be half of the sampling rate (available in the Web Audio API via context.sampleRate).

======[https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode ampArray]======
The getFloatTimeDomainData() method of the AnalyserNode interface copies the current waveform, or time-domain, data into a Float32Array passed into it.

getByteTimeDomainData (and the newer '''getFloatTimeDomainData''') return an array of the size you requested - its frequencyBinCount, which is calculated as half of the requested fftSize. That array is, of course, at the current sampleRate exposed on the AudioContext, so if it's the default 2048 '''32,768''' fftSize, frequencyBinCount will be 1024 '''16,384''', and if your device is running at 44.1kHz '''192,000hz''', that will equate to around 23ms of data. The byte values do range between 0-255 (we're floating point, so I think ours may be different), and yes, that maps to -1 to +1, so 128 is zero. (It's not volts, but full-range unitless values.)

If you use getFloatFrequencyData, the values returned are in dB; if you use the Byte version, the values are mapped based on minDecibels/maxDecibels (see the minDecibels/maxDecibels description).

====Old info====
Core [https://github.com/mdn/webaudio-examples code examples]. Allows me to work on PCM analysis in supported browsers without worrying about format or synchronization.

===Tool Problems===
# ''CLOSED'' [https://github.com/audiojs/audio-generator/issues/25 pmo1948] known issue. Package not maintained for years. Removing from project as not really required. audio-generator generates an audio source, and I have my guitar for that.
# ''CLOSED'' Gulp is too much work for limited benefit. If automation becomes necessary I will try npm scripts.

==Research==
* RIFF wav subformat [https://docs.microsoft.com/en-us/windows-hardware/drivers/audio/pcm-high-bitdepth-stream-data-format guid]
* [https://archive.org/details/Silence5Minutes8000HzInWav Archive Silence]
* [https://mobile.codeguru.com/cpp/g-m/multimedia/audio/article.php/c8935/PCM-Audio-and-Wave-Files.htm PCM Audio and Wave files]
* [https://www.jensign.com/multichannel/multichannelformat.html File format for 24 bit 48 kHz 5.1 Multichannel WAV file]

===Web Audio API Visualization and Study===
[https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Visualizations_with_Web_Audio_API Visualizations with Web Audio API]

<noinclude>
==Metadata and References==
{{EpicSection
|Section=6
|Chapter=1
|Part=1
}}
</noinclude>