
hujairi

Members
  • Posts

    47
  • Joined

  • Last visited

Profile Information

  • Gender
    Male
  • Location
    Seoul, South Korea

Recent Profile Visitors

4,386 profile views
  1. Hi, Forgive me for these simple questions, but I'm curious to know what's the best way to display all the libraries I have? For example, I'm now trying to build new libraries that include ambituses of custom-built instruments I will use, rhythmic patterns I am developing, and modes/scales I will be using. I have the following questions: 1. I'm trying to understand the difference between library, def-library, and create-library. I am particularly interested in better understanding def-library and create-library. Are the two functions very different? 2. If I want to see all my rhythm libraries, for example, how can I see what I've already got on my computer? Many thanks.
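     For question 2, the only thing I've come up with so far is listing files on disk in plain Lisp; the folder path below is just a guess based on my own workspace, not the documented location:

       ;; List every .lisp file under an assumed user-library folder, so I can
       ;; see which library definitions I already have on this machine.
       ;; The "Opusmodus/Def Libraries" path is only a guess for my own setup.
       (directory
        (merge-pathnames
         (make-pathname :directory '(:relative "Opusmodus" "Def Libraries" :wild-inferiors)
                        :name :wild :type "lisp")
         (user-homedir-pathname)))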
  2. Very sad to hear this. He was very inspiring and supportive. I learned so much from him.
  3. Just a small question, and I am not entirely sure it'll be a big help, but did you try quantizing the MIDI in a DAW first? I've had a similar problem once before, and just running it through a DAW and quantizing everything for some reason cleaned up the MIDI for me. I hope this helps.
  4. Dear Janusz, How about using the following four parameters?
     - Length (according to range): wind direction (degrees): floating point, between 0 and 360.0
     - Dynamics: wind speed (m/s): floating point
     - Pitch: temperature (Celsius): floating point
     - Chance of randomly modulating length (according to the humidity range): humidity (percentage): floating point
     Do you think this can be done? I've attached a screenshot of what I'm currently working with to receive weather API data. Thanks.
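     Here is a rough sketch in plain Lisp of the kind of scaling I have in mind; the target ranges (lengths from 1/16 to 1/2, MIDI velocity 20-120, MIDI pitch 36-96) and the input ranges for wind speed and temperature are placeholders I picked for illustration, not settled choices:

       ;; Linear rescaling of a weather value onto a musical range.
       (defun rescale (value in-min in-max out-min out-max)
         "Map VALUE from [IN-MIN, IN-MAX] onto [OUT-MIN, OUT-MAX]."
         (+ out-min (* (- out-max out-min)
                       (/ (- value in-min) (- in-max in-min)))))

       ;; Wind direction (0.0-360.0 degrees) -> note length between 1/16 and 1/2.
       (rescale 247.5 0.0 360.0 1/16 1/2)

       ;; Wind speed (0-30 m/s assumed) -> MIDI velocity 20-120.
       (round (rescale 7.3 0.0 30.0 20 120))

       ;; Temperature (-20 to 40 deg C assumed) -> MIDI pitch 36-96.
       (round (rescale -3.2 -20.0 40.0 36 96))

       ;; Humidity (0-100 %) -> probability that a given length gets modulated.
       (let ((probability (rescale 31.1 0.0 100.0 0.0 1.0)))
         (< (random 1.0) probability))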
  5. Hello, I am working on a sound installation for two voices (using Soniccouture's Geosonics VST plugin and a vibraphone VST plugin running through a computer). The source of the data is taken live from the internet and is actually API weather data from a city in South Korea. The piece will play for 2 hours every night between early December and early March. The trouble I am having is how to transform the different fragments of weather data into data I could use within Opusmodus. Here's the type of data I get for each parameter (the options are either floating points, integers, or true/false). All of this arrives on my computer as OSC signals.
     1. wind direction: floating point, between 0 and 360.0
     2. wind speed: floating point
     3. sky code (i.e., clear, cloudy, rain, snow): integer
     4. precipitation type: integer (codes for each type of precipitation)
     5. temperature (Celsius): floating point
     6. temperature minimum: floating point
     7. temperature maximum: floating point
     8. humidity: floating point (representing a percentage, e.g., 31.1% = 31.1)
     9. pressure: floating point (at the time of writing, the pressure of the place I was looking at was 1008.9, while here in Seoul the pressure is 922.8)
     10. lightning: true or false
     My questions are: How can I receive OSC signals within Opusmodus? Is there a function that allows me to do this? Also, I've never used the Live Coding function in Opusmodus, although I have a basic knowledge of live coding in other environments such as SuperCollider, Sonic Pi, and Max/MSP. I would like the incoming data to drive live coding functions that modulate the music as it plays. I would really, really appreciate any advice I may get on this. I don't mean to have the community here 'do everything' for me, but I'd appreciate any direction toward successfully accomplishing this project. Thanks very much and sorry for any trouble, Hasan
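     To make the question more concrete, this is roughly the dispatching I imagine once the messages arrive, whatever the Opusmodus function that actually receives them turns out to be. The handler below, and the idea that a message comes in as an address string plus a list of values, are my own assumptions rather than the Opusmodus API; it only sketches the plain-Lisp bookkeeping I would hang off the real receiver:

       ;; Assumed message shape: an OSC-style address string plus a value list,
       ;; e.g. "/weather/wind-speed" with (7.3). How Opusmodus actually delivers
       ;; OSC is exactly what I'm asking about.
       (defvar *weather* (make-hash-table :test #'equal)
         "Latest value received for each weather parameter.")

       (defun handle-weather-message (address values)
         "Store the first value of an incoming message under its address."
         (setf (gethash address *weather*) (first values)))

       ;; A few of the ten parameters listed above:
       (handle-weather-message "/weather/wind-direction" '(247.5))
       (handle-weather-message "/weather/wind-speed" '(7.3))
       (handle-weather-message "/weather/lightning" '(nil))

       (gethash "/weather/wind-speed" *weather*)  ;=> 7.3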
  6. Fantastic suggestion. I'd love to see some focus on those same matters, too. I had asked in the past here on the forum about the Infinity Series and Spectral Techniques. Thanks for bringing this up.
  7. Thank you very, very much for this. I hope to create something new with Opusmodus very soon using this approach. Really looking forward to it.
  8. Hello, I am trying to find different ways of importing partials/frames data into Opusmodus. I am currently trying to use spectral analysis to analyze certain music/audio phenomena I haven't really seen analyzed before, and to adapt them into my own composition process. The sounds are generally archival, or things I have synthesized on the computer through software myself (in .wav format), and are usually between 1 and 5 seconds in length. Is this a good length of time over which to analyze the spectral data of sounds? I just want to understand the following about the spectral tools within Opusmodus, and I would really appreciate it if anyone could help me out. 1. Is SPEAR the only tool we can use for spectral analysis? Is something like AudioSculpt also usable? 2. Looking at SPEAR, when we export the data, which of the following export formats do we use: 1. Text - Resampled Frames, 2. Text - Partials, 3. SDIF 1TRC - Resampled Frames, 4. SDIF 1TRC - Exact Interpolated, or 5. SDIF RBEP? 3. Once I have the data in the correct format, I think I want to create a library of the datasets I import so that I can reinterpret them in the future. If I want to use the "get-tuning" function, for example, and then map it across certain instruments within my planned orchestration, what would be an effective way of doing this? One of my main challenges is finding a way to keep all my data organized every time I want to use Opusmodus for a new project. I still find myself using Opusmodus in little blocks as I compose, and I hope to eventually get to the point where I can compose entire pieces using only Opusmodus. I really appreciate your help.
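     To give an idea of what I want to do with the partials once they are imported, here is the kind of conversion I have in mind in plain Lisp; the frequencies are made-up example values, not real analysis data:

       ;; Convert a partial's frequency in Hz to the nearest MIDI key number,
       ;; using A4 = 440 Hz = MIDI 69.
       (defun freq-to-midi (freq)
         (round (+ 69 (* 12 (log (/ freq 440.0) 2)))))

       ;; A few made-up partial frequencies from a short analysis frame:
       (mapcar #'freq-to-midi '(220.0 447.1 663.8 1108.0))
       ;=> (57 69 76 85)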
  9. I haven't yet had the chance to test MIDI controller integration with Opusmodus + Ableton, but I'd be curious to learn more about your findings...!
  10. Hello, I have been reviewing James Pritchett's The Music of John Cage lately, and was particularly surprised by the detail in which some of the techniques John Cage developed in the 1940s are described, such as his "gamut" technique and "chart" technique. I am very curious to know whether you are aware of any attempts to use Opusmodus to create something similar to these techniques. In theory, the gamut technique is a collection of "sound materials" (with varying content, and often containing most of the chords in a piece) assembled before the composition is arranged, while the chart technique lays out the different possible sounds in a chart according to the types of instruments used in the piece, with each horizontal axis giving more probability to the appearance of a particular instrument (I wonder if that makes sense). It seems like a good exercise in composition if nothing else, but I'm curious to see how to recreate this in OMN. Otherwise, are there any plans to implement some of John Cage's compositional techniques as OMN functions? Thanks.
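     To make the question more concrete, here is how I currently picture the two ideas, sketched in plain Lisp rather than OMN; the gamut contents and the weights are invented for illustration:

       ;; A "gamut": a fixed collection of sound materials (single notes, chords)
       ;; prepared before the piece is arranged.
       (defvar *gamut*
         '((c4) (e4 g4 b4) (fs3) (d4 f4) (a3 c4 e4)))

       ;; A "chart" row: one weight per gamut entry for a given instrument,
       ;; so each instrument favours some materials over others.
       (defvar *chart*
         '((violin 5 1 1 2 1)
           (piano  1 3 1 1 4)))

       (defun weighted-choice (items weights)
         "Pick one element of ITEMS with probability proportional to WEIGHTS."
         (let ((r (random (reduce #'+ weights))))
           (loop for item in items
                 for w in weights
                 do (if (< r w) (return item) (decf r w)))))

       (defun pick-material (instrument)
         "Choose a gamut entry using INSTRUMENT's row of the chart."
         (weighted-choice *gamut* (rest (assoc instrument *chart*))))

       ;; Eight successive choices for the violin:
       (loop repeat 8 collect (pick-material 'violin))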
  11. Dear Stephane, Thank you for your fantastic feedback. I remember very well how much you stressed the importance of do-timeline. I will also look very carefully at gen-divide and the other examples you mentioned. I sometimes think I need to make my own "headings" inside of Opusmodus with my own explanations for some of the techniques and how they can be applied to my idea of composition. Sometimes, those "headings" I use would change depending on the style I'm trying to compose, too. I guess maybe this is the deeper level of something powerful like Opusmodus: we have so many flexible tools that can do so many things, yet we need to be creative in finding new ways to use those tools to create interesting results. I also need to learn more about the generative techniques I'm not personally familiar with, such as the Rubin Series and the Nørgård Series. It's really fascinating that they are included in Opusmodus, and I guess I need to explore them more. I also want to see if I can use IRCAM's AudioSculpt with Opusmodus instead of SPEAR. I wonder if there are any benefits of one over the other. I'm just looking into Tristan Murail's process (he used AudioSculpt and OpenMusic), and trying to replicate it in Opusmodus. I hope to start thinking more carefully about these issues so that I can make better use of Opusmodus.
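     As a first step in that direction, I wrote out the Nørgård infinity series by hand in plain Lisp, just to make sure I understand the recurrence before looking at how Opusmodus generates it:

       ;; Noergaard infinity series: s(0) = 0, s(2n) = -s(n), s(2n+1) = s(n) + 1.
       (defun infinity-series (n)
         (cond ((zerop n) 0)
               ((evenp n) (- (infinity-series (/ n 2))))
               (t (+ 1 (infinity-series (/ (1- n) 2))))))

       ;; First sixteen terms, read as intervals in semitones from the start pitch:
       (loop for i below 16 collect (infinity-series i))
       ;=> (0 1 -1 2 1 0 -2 3 -1 2 0 1 2 -1 -3 4)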
  12. Dear all, I am curious to know whether any of you have tips or strategies for approaching 'structure' in composition using Opusmodus. One of the challenges I face while using Opusmodus is that I tend to think in terms of the general structure of the composition and its attendant constraints. I find it very difficult to think about the microstructures within a piece, mainly, I think, because I don't know enough about how to approach the idea of microstructures. By this I mean that I generally approach composition within Opusmodus by generating pitches/lengths/articulations and separating them into sections. However, within each section, I wish I knew of strategies/ideas to generate even more complexity at a very subtle/micro level. Again, I think my main challenge in reaching the next stage in my use of Opusmodus is developing a clearer strategy with regard to structure. I'm very curious to hear how others solve this issue.
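     A toy example in plain Lisp of the level I mean: take one section's pitches, divide them into small cells, and vary each cell slightly. The cell size and the transposition amounts are arbitrary choices for illustration:

       ;; Divide a flat list into consecutive cells of SIZE elements.
       (defun divide-cells (size list)
         (loop while list
               collect (loop repeat size while list collect (pop list))))

       ;; Apply a small, different transposition (in semitones) to each cell.
       (defun vary-cells (cells offsets)
         (loop for cell in cells
               for offset in offsets
               collect (mapcar (lambda (p) (+ p offset)) cell)))

       ;; One section's pitches as MIDI numbers, then micro-variations per cell:
       (vary-cells (divide-cells 3 '(60 62 64 65 67 69 71 72 74))
                   '(0 1 -2))
       ;=> ((60 62 64) (66 68 70) (69 70 72))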
  13. Scala files, I realize, are lists of tuning values (ratios or cents) that can subsequently be mapped to MIDI. I have in the past imported lists of frequencies into programs such as Max/MSP for other projects, but I'm curious to see how this could be done efficiently in OM. It'd be nice to be able to simultaneously create library lists (stored in the OM directory for future use) without needing to go manually through each of the over 4,000 tunings available. I also assume this matter is connected somehow to any future OM functions that allow for microtonal music. As a temporary work-around, is there a way to turn frequencies from floating point values into ratios (if that is of any use), with the ratios then usable in parameters such as pitches, lengths, and velocities (perhaps through remapping procedures)? An example of the content of an .scl file can be seen in the following:
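     For illustration, a typical .scl file (just the general shape of the format, not the particular tuning I had in mind) looks like this: lines starting with "!" are comments, the first non-comment line is a description, the next is the number of notes, and then each degree follows as either a ratio or a value in cents:

       ! example.scl
       !
       A small just-intonation example
        4
       !
        9/8
        5/4
        3/2
        2/1

     And here is the temporary work-around I was thinking of, in plain Lisp; RATIONALIZE is a standard Common Lisp function, and the cents conversion is the usual 1200 times the base-2 logarithm of the ratio:

       ;; Turn a frequency ratio given as a float into an exact ratio, then into
       ;; cents. RATIONALIZE picks a simple rational that reads back as the same
       ;; float, e.g. 1.5 -> 3/2.
       (defun float-to-ratio (x)
         (rationalize x))

       (defun ratio-to-cents (ratio)
         (* 1200 (log ratio 2)))

       (float-to-ratio 1.5)     ;=> 3/2
       (float-to-ratio 1.25)    ;=> 5/4
       (ratio-to-cents 3/2)     ;=> 701.955 (a just perfect fifth)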