Dewdman42

  1. Dewdman42

    Scoring To Video In Samplitude?

    Cool. Well, I have said what I need to. I will check back in a few months to see how Samplitude is faring. Thanks and good luck, everyone.
  2. Dewdman42

    Scoring To Video In Samplitude?

    I am using the Matrox P650. I will try turning down hardware acceleration later; I'm up against a deadline right now. But honestly, why do Media Player, QuickTime, Sonar, and everything else with video work perfectly fine on this computer, but not Samplitude? See my point?
  3. Dewdman42

    Scoring To Video In Samplitude?

    I already have the latest video drivers, DirectX, etc. I did not see a DirectShow checkbox. There is an overlay checkbox and I tried it both ways, with bad results both times. Sad to hear that this is the end of the quest. Good luck, everyone.
  4. Dewdman42

    Scoring To Video In Samplitude?

    Please check to make absolutely sure that the first key press actually moves the video. I know it already moves the timeline, but the first keypress does not move the video here; after that it works fine. The only way to be sure is to have the actual timecode (or something similar) burned into the screen so that you can visually verify that the video in fact advances on that first keypress and remains correctly in sync, frame for frame, with what the Samplitude time clock says. It is already an AVI file. AVI files can contain all kinds of different video streams, including DV. As I said before, if Samplitude can't handle DV format, then it's no good for me. The other problems, with the video not playing, the screen blanking out or turning to gibberish, etc., well, that goes without saying: not functional, and I really can't understand why a DV-stream AVI file should be causing that. If it is, then Samplitude needs more work in this area. (shrug)
  5. Dewdman42

    Scoring To Video In Samplitude?

    Please check to make absolutely sure that the first key press actually moves the video. I know it already moves the timeline, but the first keypress does not move the video here; after that it works fine. The only way to be sure is to have the actual timecode (or something similar) burned into the screen so that you can visually verify that the video in fact advances on that first keypress. It is already an AVI file. AVI files can contain all kinds of different video streams, including DV.
  6. Dewdman42

    Scoring To Video In Samplitude?

    Thanks for the explanation about the timeline grid. That sorta works; more on that in a minute. Another question: is it possible in Samp to have two timelines showing at once? I would like both SMPTE and MBT showing at the same time.

    The arrow keys now do advance one frame at a time. However, the very first frame does not advance the video. So, for example, if I rewind to the beginning, the Samp time display shows exactly the same value I see burned into my video window (I have the SMPTE times burned into the video visual). When I hit the right arrow key, Samplitude advances to the next frame, but the video itself does not advance by one frame. When I hit the right arrow again, they both advance and continue to do so normally, except that the video remains one frame off from what the Samplitude clock says.

    If I hit PLAY, the video itself does not play, but I can see the time counter incrementing and I can hear the audio track. The video does not move. When I hit stop, the video jumps to the location on the timeline where I stopped and lines up perfectly, but from there the same situation I outlined in the previous paragraph holds: the first right arrow moves the Samplitude time forward by one frame but does not advance the actual video, and while the subsequent arrow keys move the Samplitude clock and the video together, the video is by then always one frame out of sync with the NOW time in Samplitude. So two problems so far: playing does not play the video, and the arrow-key advance is somehow not keeping the video in sync. In addition to that, the video screen often goes blank or shows jumbled graphics. Sometimes resizing the video window corrects it, sometimes that doesn't work either.

    Incidentally, in case this is part of the issue, I have the SMPTE offset set to 02:22:38:01, since that is the starting frame burned into the video window and I want Samplitude to show the same time in its clock. That seems to work, except for the first arrow advance not advancing the video. The help file seems to indicate that this SMPTE offset is more related to syncing with external SMPTE signals, which is not exactly what I'm doing here, but it seems to work for establishing a particular HH:MM:SS:FF as the starting frame of the project at bar 1.
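    For anyone who wants to sanity-check that offset, here is the arithmetic as I understand it. This is purely my own illustration, nothing from Samplitude, and the frame rate and sample rate are assumed values (NTSC non-drop-frame counting; drop-frame timecode would need extra handling):

      # Minimal sketch: convert a non-drop-frame SMPTE timecode (HH:MM:SS:FF)
      # into an absolute frame count, seconds, and a sample position.
      def smpte_to_samples(timecode, fps=29.97, sample_rate=48000):
          hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
          total_frames = ((hh * 60 + mm) * 60 + ss) * round(fps) + ff  # non-drop counting
          seconds = total_frames / fps                                  # wall-clock time
          samples = round(seconds * sample_rate)
          return total_frames, seconds, samples

      if __name__ == "__main__":
          frames, secs, samps = smpte_to_samples("02:22:38:01")
          print(f"{frames} frames = {secs:.3f} s = sample {samps}")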
  7. Dewdman42

    Scoring To Video In Samplitude?

    DV format is DV format. It is the standard format shot by 99% of digital camcorders out there, and it is usually the PREFERRED format on most platforms because it is not compressed and therefore requires less CPU to display the video. Further, every frame is preserved, which means that as you seek around from frame to frame, it's going to be easier to handle. Whenever you deal with compressed video it gets more CPU intensive and more complicated to extract specific frames when moving frame by frame, etc. In most other DAWs, including Sonar, it is usually recommended to use uncompressed DV format if possible. The only disadvantage is that the file size is large. If Samplitude can't handle DV and that is the only problem, that would also be a deal breaker for me, but I will try some other compressed formats as well to see what happens.

    An AVI file is not necessarily compressed. AVI and MOV files are merely wrappers around something else internally. An AVI file usually has an embedded video track and an embedded audio track; they are actually separate tracks within the AVI. MOV files additionally allow text tracks; I'm not sure if AVI does or not. Whether the actual video inside the AVI is compressed depends entirely on the codec that was used to create it. So an AVI file could be DV-format video, or it could be MPEG, or it could be any one of a variety of different video streams that require a special codec to display.

    Unfortunately I am using the demo version of Samp right now, so I cannot try MovieEditPro to create this proprietary video format. It sounds like Samp is optimized to work with this Magix format and everything else is wishful thinking? No, uncompressed should be easier than compressed for frame-by-frame; see above. Another compressed format I have seen recommended a lot is MJPEG (Motion JPEG), which also preserves every frame but uses JPEG to shrink the size of each frame. It produces files that are maybe 30% of the size of the full DV stream, but will generally be more easily handled by most DAWs because every frame is completely intact. I really don't understand why the video being uncompressed should have anything to do with it. But regardless, if that is true, it's a deal breaker. I'm rather surprised to find out this is the case.

    I don't need video editing. I don't need multiple video tracks. I don't need to slide video tracks in time. I just need a DAW that doesn't spontaneously hide the video window, that allows me to advance frame by frame (by the way, Samp does kind of advance frame by frame with the scrub wheel, so I know it must be possible; the scrub wheel is just hard to control), that displays the video thumbnails in the track view, that plays when I hit the play button, that stays in sync with the VIP project, and that can handle pretty much all the major video formats out there, especially including DV uncompressed video. You're right: if Samp isn't there yet, then I can't migrate. Sonar does all this perfectly. But Sonar's PRV sucks compared to Samp's.
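    By the way, the container point is easy to verify. Here is a rough sketch (my own illustrative code, nothing to do with Samplitude) that scans an AVI's stream headers and prints the FourCC codec handler of each stream, which is what actually tells you whether the video inside is DV, MJPEG, MPEG, or something else:

      # Rough sketch: report the FourCC stream type and codec handler of each
      # stream in an AVI container. A proper RIFF parser would walk the chunk
      # tree; a simple scan for 'strh' chunks is enough for illustration.
      import sys

      def avi_stream_handlers(path):
          with open(path, "rb") as f:
              data = f.read()
          if data[:4] != b"RIFF" or data[8:12] != b"AVI ":
              raise ValueError("not an AVI file")
          streams = []
          pos = 0
          while True:
              pos = data.find(b"strh", pos)
              if pos < 0:
                  break
              # Chunk layout: fourcc (4) + size (4) + payload; the 'strh' payload
              # begins with fccType (e.g. b'vids') and fccHandler (e.g. b'dvsd').
              fcc_type = data[pos + 8:pos + 12].decode("ascii", "replace")
              fcc_handler = data[pos + 12:pos + 16].decode("ascii", "replace")
              streams.append((fcc_type, fcc_handler))
              pos += 4
          return streams

      if __name__ == "__main__":
          for fcc_type, handler in avi_stream_handlers(sys.argv[1]):
              print(f"stream type {fcc_type!r}, codec handler {handler!r}")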
  8. Dewdman42

    Scoring To Video In Samplitude?

    I tried right-clicking on the video window and could not find any way to set the arrow keys to frame-by-frame. The scrub wheel does work, but it's difficult to control precisely for frame-by-frame accuracy. Anyway, I get the feeling that the arrow keys affect the main track view directly, which indirectly positions the video window as a remote slave kind of thing. Still, I don't know how to configure Samplitude to move frame by frame with the arrow keys or any other keys. Do you know? The fact that the video window will occasionally black out, that the video does not play when I hit play on the transport, that when I close the video window it goes to some hidden place and won't come back until some random time later, etc.: these are all fluky reasons that I have to take a pass on Samplitude for video scoring for now. But if these things are fixed in the future, post to this thread, because I want to like Samplitude. Video scoring is a major thing for me, and Sequoia is not an option.
  9. Dewdman42

    Scoring To Video In Samplitude?

    No. Mini-DV, which is not MPEG and is about as uncompressed as you can get. Samplitude should not be having any problem with it. Sonar and QuickTime both handle it fine.
  10. I have been trying to use the Samp10 demo to see how it would be for scoring to video and, so far, I am not impressed. I can't get anything to work right. I found the Medialink page under project options and chose my NTSC DV-format video. It loads and imports the audio. I hit play, and the video does not play along. I use the arrow keys to advance the video forward and back 15 frames at a time. Is there a way to advance only one frame at a time? Anyway, when I do that, the VIP NOW timeline seems to move forward and back, so it appears to be linked up. However, hitting play on the transport does not cause the video to play.

    There is no video timeline on the VIP. In the Medialink page I see two greyed-out checkboxes for "show current video frames" and "show video track". Is that the only way to see a video timeline? How do I get a video timeline? How can I move one frame at a time? How can I get the video to play along with Samplitude?

    Furthermore, after futzing around a few minutes, the video window changes to contain an odd assortment of desktop graphics (i.e., not my video content), and I can't seem to get the video back. I try closing the window, but looking under the window menu after that shows it still checked. I uncheck it and check it again; it does not come back. Later on I'm futzing around and it magically appears again. Either there is some magic setting to make this work, or I have to say it does not work very well. Can anyone help me?
  11. Dewdman42

    Question About Midi Resolution

    Frank, thanks for such a nice explanation. I do indeed like the direction you guys are going with the double-float-precision internal timestamps. And the ability to lock MIDI events to exact sample points is great, with the future possibility of extracting sample-accurate MIDI events from, for example, a recorded audio drum track, and being able to use the EXACT same groove as the audio, sample-accurate, without any jitter introduced by some lower PPQ that would resolution-quantize the material. I like it!! In time you may even be able to come up with a collection of groove templates that ship with the product and that everyone can use, templates that just groove way better than any old MIDI-based groove templates because of the finer resolution.

    Another question: what about when it comes time to export that MIDI track as a Standard MIDI File? What values will be used for each MIDI event? You lost me about what exactly that 64-bit beat position is you are talking about. I know in my case I deal with crazy tempo maps that change all over the place, and I am constantly having to tweak them (for film scoring). I would need to be able to change the tempo at any time or place and have the MIDI data adjust in a way that makes musical sense, not adhere to a real-time factor. For example, if I record a MIDI event that is 12000 samples after a beat and then change the tempo, the value should not remain 12000; the event needs to move in a way that makes sense musically for the tempo change. Traditionally, in most sequencers, a MIDI event sits at a certain M:B:T in the track. The size of T changes with the tempo so that the events are always at the right distance from the beat, relatively speaking, as you change the tempo. You mentioned that things can get a little tricky related to this; can you please clarify? I would think it might be an option to have MIDI events locked to an actual point in real time instead of a traditional M:B:T, but that seems like the exception more than the rule to me. Otherwise, we are forced to pretty much figure out the exact tempos ahead of time, record the MIDI data, and never change the tempo again if we want the MIDI track to continue to sound good.

    Regarding the 1 ms OS limitation, I totally hear you; there is a lot of slop in both the MIDI hardware and the Windows OS, probably well more than 1 ms for most people. So the reality is that there is currently no way to really capture sub-millisecond timing nuances from a MIDI controller into a sequencer of any kind unless you have hardware timestamps available, which nobody on Windows does. (However, the DirectMusic driver does use a sub-millisecond timer, FYI.) Lots of times it will mean editing the MIDI data, either with quantizing or manually, to move MIDI events to better musical locations, and I gather that Samplitude has much finer precision than anyone else for doing this, provided that the beat position is based on a musical position, not a real-time position after each beat. I guess what I am saying is that I'm not sure I want Samplitude to capture those 1 ms timestamps exactly as-is. I want Samplitude to be able to round the values out to a PPQ grid resolution of my own choosing. This would be the equivalent of having an input-quantize set to, for example, 1536 PPQ. And I still don't understand the purpose of the PPQ field in the Samplitude project options. If everything is being stored as a 64-bit beat position, then how is the PPQ value used?

    Let me relate this to Sonar operation just so you can understand my questions better, perhaps. In Sonar, you set a PPQ resolution for the project. It has a max of 960, with maybe about 10 preset resolutions; you can't type in any random value like you can in Samplitude. I found out from Cakewalk that the purpose of the PPQ setting is only to affect how MIDI events are displayed in the PRV and event list, but that internally they always use a resolution of 960 to store the MIDI events for the track. If I then change the project PPQ setting to, say, 480, the MIDI events retain their original 960 PPQ resolution in storage, but the display of those events is rounded out to a 480 PPQ grid. If you move events around, they will land on the 480 grid, but if you don't move them, then changing the project PPQ setting back to 960 shows the original 960-resolution timestamps. In other words, the underlying storage resolution does not get resolution-quantized to 480 PPQN unless the MIDI event is moved or changed at that lower PPQ setting. I actually don't like that aspect of Sonar. If I set the PPQ to 768, then I want the data quantized to that grid. I do see value in material being quantized to musically sensible grids like 768 or 1536 instead of 960.

    While I like the idea of the 64-bit-precision beat position you guys have introduced (I like it for the groove-matching possibilities you mentioned; that is exciting), I think that if input MIDI data is resolution-quantized to a 1 ms grid, that is not any better than the 960 grid imposed by Sonar. It's a non-musical, random quantization. On the other hand, if it were possible to have input notes rounded out to 768 or 1536 PPQ, then the final calculated timestamp that is stored would be in a more musically metrical location in the track.

    Here is a diagram to explain what I'm talking about. It shows a traditional sequencer's PPQ grid at several resolutions. Across the bottom are 1 ms gaps, and the interesting thing is that the 1 ms intervals are not usually going to line up with the PPQ grid; they will be offset by some random amount. During any 1 ms period of time, one or more MIDI events come in, they get timestamped at the trailing edge of that 1 ms gap, and then in a traditional sequencer they would further get rounded to a PPQ resolution. Notice how 960 PPQ has hardly any ticks that line up in metrically musical spots, but under 384, 768 and 1536 the locations are more musical (at a very fine level)?

    Anyway, sorry to get off on such a technical tangent. I have been following Samplitude for a while now for many reasons. I really, really, really love the PRV and staff view, which I think is the best in the world right now. MIDI timing issues are crucial to me, and I'm not too happy with Sonar's options either, to be honest. I plan to put the Samp10 demo through some intensive sessions to see if it will work for me, if for no other reason than the PRV/staff view, which I think is pure brilliance. Thanks again for the clarifications.
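    As a concrete illustration of what I mean by input-quantizing to a chosen PPQ (this is just my own sketch of the arithmetic, not a description of Samplitude's internals; the tempo, sample rate, and event position are made-up values): a captured event time in samples is converted to beats at the current tempo, snapped to the nearest tick of the chosen grid, and converted back to samples.

      # Sketch of input-quantize arithmetic: snap a captured sample position
      # to the nearest tick at a given PPQ, using the current tempo.
      def snap_to_ppq(event_sample, ppq, bpm=120.0, sample_rate=44100):
          samples_per_beat = sample_rate * 60.0 / bpm
          beats = event_sample / samples_per_beat      # position in beats
          ticks = round(beats * ppq)                   # nearest grid tick
          snapped_beats = ticks / ppq
          snapped_sample = round(snapped_beats * samples_per_beat)
          return snapped_beats, snapped_sample

      if __name__ == "__main__":
          event = 12_345                               # samples after bar start (assumed)
          for ppq in (384, 768, 960, 1536):
              beats, sample = snap_to_ppq(event, ppq)
              print(f"PPQ {ppq:>4}: beat {beats:.6f}, sample {sample}, "
                    f"error {abs(sample - event)} samples")

    The point is only that the snapped position falls on an exact subdivision of the beat, whereas a raw 1 ms timestamp usually will not.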
  12. Dewdman42

    Question About Midi Resolution

    Thanks so much for responding to my questions, and so quickly! I know I may sound a little nuts asking some of these things, but I have been involved in some long discussions related to MIDI timing, so I'm just trying to get to the bottom of what Samp could or could not do for me over the other DAWs (which are flawed in this area). Sure, I realize that 1 ms is the Windows timer and that this is perfectly reasonable and what most other sequencers do as well, though some are able to use the DirectMusic timestamp instead of the one they generate in the sequencer, which may or may not be lower level and more accurate, depending on who you talk to. A few very lucky souls are able to use hardware MIDI timestamps with the right setup, but unfortunately the industry never standardized on that. In those cases, some sequencers can ignore their own timestamp and use the one provided by the driver.

    But my question is rather this: approximately every 1 ms, Samplitude will go see what MIDI events are waiting and assign a timestamp to those MIDI events in the track. Will that timestamp be whatever the current sample count is at that resolution, or will it be something based on M:B:T? The reason I ask is because the math that happens during this operation will have to round to the nearest M:B:T if that is used, and at this point I'm not sure how large that resolution is. I know in Sonar it is hard-coded to round to 960 PPQN always, regardless of what the MIDI tick resolution is set to. However, if the calculation is done at sample-count resolution, then there will be less rounding error. NOBODY else does that. If Samplitude does, I will be impressed. Does it? If it does, can I assume that any timestamps generated by a lower-level driver, as I explained earlier, are ignored by Samplitude?

    What is the "PPQ" entry field in Samplitude's project options designed for? What does it do? What is the max value I can put there? What is its effect on how the MIDI event time calculations are performed when the events come in every ~1 ms, what is stored internally in the track, and what do I have access to change? I see that the example project has this value set to 384. What is the significance of that? Can I set it higher?

    You lost me a little bit there. I assume you mean that a note is stored on the track as M:B:?, where the "?" is a sample-position offset from the beat? Now that I think about it, that gets a little weird with changing tempos. When using traditional M:B:T ticks, the time value of a tick changes with the tempo. OK, thanks. Are there any plans to increase the resolution options for quantizing?

    Heh heh. Well, let's not start a debate about who can hear what. No, I don't personally think 384 is enough resolution, for a lot of reasons that I won't bore you with here, though I agree that many people will probably be happier using 384 since it will impose a sort of automatic tightening of their material in a way that makes metrical sense. 960 PPQN actually does not make metrical sense; I'd almost rather use 768 for this reason. 384 has the same attribute, though more coarse; 1536 would be the ultimate for me. I created a diagram at one point showing the rounding errors that happen at 960 compared to 768; if you are interested, PM me. A PhD somewhere actually ended up using it in one of his lectures. Anyway, the point is, Sonar is hard-coded to 960, which is a pet peeve of mine. Internally it always rounds events to 960.

    Are you saying Samplitude always rounds events to 384? I'm not sure whether I would prefer 384 over 960, maybe yes, maybe no, but 768 or 1536 would really be my preference if a sequencer supported it. No: if the tick resolution is 384, then a certain form of quantization is happening to all incoming MIDI notes; they get rounded out to the nearest tick. 384 divides better into 3/4 and 4/4 meters than, say, 480. Even though 480 is higher resolution, it would round all notes to oddball places in terms of musical meter. Obviously 960 is better than 480, but it's debatable whether it's better than 384, and it definitely may not be better than 768. Generally speaking, 768 or 1536 may assign notes to locations in the track that are metrically in better musical places. PM me if you want and I will send you that diagram; I don't really want to start a big discussion about it here, just trying to find out what Samplitude does. You caught my attention with the statement about it being sample-accurate into VSTis, but I just want to understand exactly what that means. Thanks again!!!
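    To make the divisibility point concrete, here is a tiny arithmetic check (my own illustration, nothing to do with Samplitude's internals): which binary and triplet subdivisions of a quarter note land on exact integer tick positions at various PPQ values.

      # Divisors of a quarter note: 2 = 8th, 3 = 8th triplet, 4 = 16th,
      # 6 = 16th triplet, 8 = 32nd, 12 = 32nd triplet, and so on down to
      # very fine binary and triplet subdivisions.
      DIVISORS = [2, 3, 4, 6, 8, 12, 16, 24, 32, 48, 64, 96, 128]

      for ppq in (384, 480, 768, 960, 1536):
          misses = [d for d in DIVISORS if ppq % d != 0]
          status = "all subdivisions exact" if not misses else f"inexact at divisors {misses}"
          print(f"PPQ {ppq:>4}: {status}")

    Resolutions that are a power of two times three (384, 768, 1536) come out exact all the way down this list, while 480 and 960 eventually fall on fractional ticks.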
  13. The new features in S10 state that it now has sample-accurate MIDI VST timing. What does that mean exactly? In most DAWs one can specify a MIDI resolution to use (for example, 960 PPQN). This is often the resolution used while capturing a performance from a MIDI controller, and when quantizing it would be the highest resolution by which notes can be moved around, etc. I think I can guess this means that I can position a MIDI note exactly where I want, down to sample resolution, which is far more precise than, say, 960 PPQN, and Samplitude will make sure the VST plugin is given that sample-accurate timestamp for the MIDI event to take place. Fine so far?

    What happens during MIDI performance capture? How are the timestamps handled? Does Samplitude use the soundcard as the MIDI tick generator based on the sample count? If so, what about MIDI interfaces or low-level MIDI drivers that provide MIDI timestamps? Are they converted accurately to a sample-based timestamp? What if I want to constrain the input MIDI so that incoming MIDI events are stamped according to a certain PPQN resolution such as 960 or 1536? I see there is a field where I can specify this value, but how is it used exactly? Are MIDI events always stored in the MIDI object with sample-based timestamps regardless of this PPQN setting? Would it be possible for me to quantize MIDI events in a track or MIDI object to a PPQN resolution I specify? Thanks
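    For what it's worth, my mental model of "sample-accurate MIDI into a VSTi" is roughly the following. This is just my own sketch of the general mechanism hosts use, not a claim about Samplitude's actual code, and the block size and event positions are assumed values: the host renders audio in blocks and hands the plugin each MIDI event together with its offset, in samples, inside the block it falls in, instead of rounding it to a block boundary.

      # Sketch: deliver MIDI events to a plugin with per-block sample offsets.
      BLOCK_SIZE = 512          # assumed audio buffer size in samples

      def events_for_block(block_start, events):
          """Return (offset_in_block, event) pairs for one processing block."""
          block_end = block_start + BLOCK_SIZE
          return [(pos - block_start, name)
                  for pos, name in events
                  if block_start <= pos < block_end]

      if __name__ == "__main__":
          # Event positions are absolute sample positions on the timeline.
          midi_events = [(100, "note on C3"), (517, "note on E3"), (1023, "note off C3")]
          for start in range(0, 1536, BLOCK_SIZE):
              for offset, name in events_for_block(start, midi_events):
                  print(f"block starting at {start}: {name} at sample offset {offset}")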
  14. First a question: someone earlier mentioned registering to get onto the main forum. I was under the impression that only registered users of Samp9 can get on that forum. Is that true? I have not purchased Samplitude yet, but I have been playing with the demo and monitoring these forums. I have noticed that this forum here tends to be ignored, though.

    Regarding quad vs. dual: I have had some email conversations with the guys at ADK about this. Bottom line: the quad will handle more tracks for mixing and overall is definitely more CPU power than the dual, AS LONG AS the latencies are set higher, say a buffer size of 512-1024. If you want to use lower latencies, like a buffer size of 128, then the quad starts to perform much closer to the dual, and if you go to even lower latencies the quad under-performs the dual. So it just depends on your needs. If you can live with a buffer >= 256 most of the time, then the quad is definitely more juice, but like others have said, the next generation might be better.

    IMHO, this is all related to the fact that regardless of how many CPUs or cores you have, you still have some hardware resources that are singular pipes: memory I/O, disk I/O, caching, FireWire I/O or PCI bus I/O, etc. If you have more cores than you have pipes to those resources, then the various cores have to wait their turn to access them. With a larger buffer setting they have plenty of time to wait in the buffer for their turn at those resources, which also means lower CPU utilization. However, if you lower the latency, then all 4 or 8 cores will be competing for the same bottleneck and you'll get a traffic jam that either chokes the bottleneck itself (and you get pops and clicks) or pegs the CPU, or both. It may be quite a while, IMHO, before quad- and 8-core motherboard technology is improved to really streamline the parallel flow of data to hardware resources in such a way that this bottlenecking does not occur. Unless you are primarily going to use your machine for large-latency mixing, you're better off with a dual; it doesn't even matter what Magix may do to Samp to take advantage of the cores.
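    To put numbers on those buffer sizes (simple arithmetic of my own, with the sample rate assumed to be 44.1 kHz): this is the wall-clock time each buffer represents, i.e. the deadline every core has to meet on each processing pass, which is why contention for shared memory/disk/bus bandwidth bites harder at low latency.

      SAMPLE_RATE = 44100  # Hz, assumed

      for buffer_size in (128, 256, 512, 1024):
          ms = buffer_size / SAMPLE_RATE * 1000
          print(f"{buffer_size:>5} samples per buffer = {ms:6.2f} ms per processing pass")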
  15. Please make sure you follow up with any results, improvements, or information you get about this, as it's very critical for me to understand what is going on and whether it's fixable.