Thursday, December 25, 2008

First donation received, yay! :)

I'd like to thank Zarxrax from the Doom9 forums for donating our first $20. This is very significant because it means SOMEONE BELIEVES IN US!

Now I need to figure out how to use that money for the project - and how to activate my PayPal account in the first place. I had left that for later, but it turns out my debit card expires soon, so I'll have to get a new card number before I can start using PayPal's facilities.

Anyway - Thank you zarxrax, we really appreciate your generous donation. Merry Christmas, and happy new year!

Tuesday, December 23, 2008

Merry Christmas!

Hello everyone! I'm glad you're still here finding out how Saya's been doing. She's been fine, but our little virtual heroine got bored: I kept working on her inner machinery and forgot to give her a nice skin. Right now she's glaring at me, demanding a new makeover ^^;;;

Seriously, I've been revamping the inner workings of Saya:

* A completely new string class.
* A completely new event engine.
* A base cross-platform application class which will make it much easier to port Saya to Qt or whatever toolkit you desire.
* One-to-one wrappers for wxWidgets' dialog boxes... well, almost :)
* A restructured source code tree.
* Tons of bug fixes.

The not-so-good news is that, as I mentioned, I haven't been able to make a video player for Saya yet, but I'm very close! :) Thanks to all these months' efforts, I hope to have a video player working within the next month... provided that I actually have free time to work on it. :)

Merry Christmas/Holidays, and a happy new year!

Sunday, December 7, 2008

Saya developer's meeting #3 (2008-11-28) chat log


<Rigoberto> Good evening..
<nekohayo> hello~!
<rick_777> evening
<rick_777> Ok
<rick_777> this meeting is officially open
<Rigoberto> nice that we chat together tonight...
<rick_777> Tonight we will discuss the progress so far and what's
expected for the next month
<Rigoberto> perfect
<rick_777> First, i have to announce that due to my new job, the
expected development time on Saya might become restricted to weekends
<rick_777> and occasionally on weekdays, depending on how heavy the workload is
<rick_777> Second, as you have seen in the mailing list, an IP lawyer
has volunteered to help us in anything legal.
<Rigoberto> ok
<Rigoberto> that's great
<nekohayo> oh, haven't seen that, but I haven't checked mail today
<nekohayo> I wonder why though
<rick_777> Also, Rigo and I have been working on the playback controls.
<Rigoberto> in case it turns out we violate a strange patent no one knows about
<rick_777> yup.
<rick_777> Anyway - I'm still undecided whether to do the fun and
entertaining GUI stuff, or to keep debugging the boring core stuff
<Rigoberto> yup Rich has been very helpful with that thanks again Rich
<rick_777> it might depend on my mood, but I'd really like to have
those nice round buttons to give the fans some eye-candy
<rick_777> that usually keeps them interested
<Rigoberto> what about both?
<rick_777> Good idea :)
<Rigoberto> :)
<rick_777> Anyway, I don't know how much time i'll be able to work.
The playback controls SHOULD become functional by january 2009.
<rick_777> which is unfortunate, as I see the deadline extending more
and more :(
<rick_777> Perhaps this would be a good time to encourage more recruiting ;-)
<nekohayo> rick_777: you mean actually controlling playback, not just
the visual widgets, I hope
<rick_777> nekohayo: Yes. Actually controlling playback
<rick_777> So we go to the Research and Development section
<rick_777> There's a library called libcairo, that is great for
drawing vector graphics. It might become useful in the timeline
<rick_777> and to modify effects
<nekohayo> cairo actually is part of gtk now, afaik.
<Rigoberto> afaik?
<nekohayo> as far as I know
<nekohayo> cairo is pretty much used a lot throughout the gnome desktop
<rick_777> But to keep my promise about the project, we'll need to
embed the cairo library in saya
<rick_777> to avoid dependency hell
<nekohayo> it's what draws pretty vectors, graphs, widgets, whatever
<rick_777> Rigo, how do you feel regarding your wxWidgets skills?
<nekohayo> dependency hell: hmm. is that really such a bad thing?
you're imposing yourself a ton of additional/unnecessary workload, no?
<rick_777> nekohayo: Dependency hell is what made it impossible for me
to install the kdenlive video editor on my machine. It was
frustrating.
<Rigoberto> Rich: improving... what's my next task?
<rick_777> I don't want to step on the same stone.
<nekohayo> you just have to properly document what are the dependencies
<nekohayo> so that users can get it compiled easily. every last one of them.
<rick_777> nekohayo: The problem isn't knowing what the dependencies
are, but having to download them if your distro doesn't provide them.
<rick_777> having the source code in the same package is a big
productivity boost for users
<nekohayo> well, if you have some really obscure ones, yes you can embed them
<rick_777> ok
<nekohayo> but cairo is mainstream.
<rick_777> Rigo, here's your next homework:
<rick_777> we still need a jog control
<nekohayo> yeah I see your point
<nekohayo> but it's additional workload for you, possibly. anyway.
<Rigoberto> do we?
<rick_777> to avoid getting the "oh no, this is too complicated!
*brain block*" syndrome,
<rick_777> we'll split your task in various subtasks:
<rick_777> 1) Draw the widget using the wxwidgets functions. Just the
little circle with a tiny circle inside it.
<rick_777> 2) Make it more appealing, like perhaps adding a shadow and
background color
<rick_777> 3) Parameterize the drawing depending on a private variable
(like "angle")
<rick_777> 4) Capturing the mousedown event (the name may be
different) to switch on a "dragging mode"
<rick_777> and popping a messagebox saying "dragging enabled"
<rick_777> 4.1) Perhaps with that you could change some color in the
jog control, like making the little tiny circle glow red
<rick_777> like a LED light, that'd be cool
<rick_777> 4.2) When the mouse button is released, make the tiny
circle stop glowing
<nekohayo> is a jog control really that important?
<rick_777> yes, it is.
<rick_777> it's essential for professional video editing
<nekohayo> from my experience, I never needed one/even knew it
existed. it didn't in vegas.
<rick_777> and there are some hardware widgets that implement a jog
<nekohayo> the thing is
<Rigoberto> Rich: I thought we already discussed this point in the list
<rick_777> you connect them to usb, and the video editor (like Canopus Edius)
<rick_777> reacts
<rick_777> um, perhaps i didn't make myself clear.
<Rigoberto> Rich: and if I remember well we agreed not to include it
<nekohayo> isn't the jog control a bit premature?
<rick_777> i apologize for the confusion.
<nekohayo> when there is no rendering/playback/splitting/tracks
<rick_777> We agreed not to include it in the first "release" of the
controls - which we already have in SVN.
<rick_777> nekohayo: I need the jog control to test the playback engine
<nekohayo> ah
<nekohayo> oh, that explains
<rick_777> right now i'll be able to test it using limited features
<Rigoberto> Rich: have you seen avidemux?
<rick_777> like play, fast forward, etc. But jog control is much more precise
<rick_777> Once, I think. let me check.
<rick_777> ah, i see it. It's pretty basic IMHO
<rick_777> everyone, please look at this.
<rick_777> http://images.google.com/images?um=1&hl=en&q=premiere+pro+jog&btnG=Search+Images
<nekohayo> why yes, it's the pro editing keyboard
<rick_777> see the jog control in there?
<nekohayo> here's a reality check, however.
<rick_777> oh?
<nekohayo> most linux users or home editors don't have this
<rick_777> but our plan is not to make a home editor
<rick_777> but to make a professional editor
<nekohayo> all things start up non professional though, unless you
develop behind closed doors for a few years :)
<rick_777> granted, small steps, but the jog is a step we can make,
and besides, making a jog control will become a great training for
Rigo. It's much easier than making the timeline controls
<nekohayo> alright
<rick_777> no pressure there :)
<rick_777> so, may I continue on your homework?
<Rigoberto> Rich: an important milestone is functional controls right?
<rick_777> Exactly!
<rick_777> and according to the development plans,
<rick_777> we need a functional player for version 0.1
<Rigoberto> Rich: Instead of the jog control can I help in implementing
the control functions instead?
<rick_777> http://sayavideoeditor.sourceforge.net/roadmap.shtml
<rick_777> the wha?
<rick_777> please be specific, what do you mean with "the control functions"
<Rigoberto> making the controls work
<rick_777> OH!
<rick_777> Well...
<rick_777> the problem is that we don't know how functional the
playback engine is
<rick_777> so i need to do it step by step: Implementing one control,
debugging...
<rick_777> and i'm still not sure what part goes where
<rick_777> so i'd rather have you doing something in a more controlled
(pardon the pun) environment
<rick_777> But if you want, you could work on the skeleton for the
control functions
<Rigoberto> I would feel like If contribution was more valuable if I
can help you making the controls work
<Rigoberto> my contribution I mean
<rick_777> actually, i think the jog is much more valuable
<rick_777> perhaps it turns out that the button control functions are
just one-liners
<rick_777> Ok, tell you what. This month we'll work together on that
<rick_777> if i beat you to it and end up doing it myself (because it
could be very easy), no hard feelings.
<Rigoberto> ok
<rick_777> But the jog control is something i really need, and i don't
have the background to do it
<rick_777> but there are tutorials on making your own widgets
<rick_777> so if you do that, your level would really increase
<rick_777> and your abilities would complement mine, and vice versa
<rick_777> besides, the real reason why i'm asking you to do it... is
<rick_777> i need someone to work on the timeline controls
<Rigoberto> what's the difference between the slider we already have
and the jog?
<rick_777> you know, moving clips around...
<rick_777> the slider should have a "spring effect" that when you drop it,
<rick_777> it goes back to the center and playback is paused.
<rick_777> Let's imagine that you implement a jog control using a
slider (which you won't).
<rick_777> if you drop the control,
<rick_777> the thumb bar goes back to the middle, but playback doesn't
change. Well, the moment you move it, playback is paused and frames
advance as long as you move the control
<rick_777> how should i put it...
<rick_777> jog control is like a car's wheel. You move it right, the
car moves right.
<rick_777> the shutter control (the slider we have) is like the
accelerator pedal
<rick_777> the more you push it, the faster it goes
<rick_777> so that's basically the difference between the jog and the shutter
<Rigoberto> If I understand correctly
<Rigoberto> avidemux does that with a single slider...
<Rigoberto> is there a program that implements your idea to see it clearly
<rick_777> yes, adobe premiere ^_^
<rick_777> let me find a youtube video
<Rigoberto> thanks
<nekohayo> I always thought the jog was a thing of the past that was
made to compensate for VCR/analog devices' lack of precision
<nekohayo> and irrelevant in the world of frames and keyframes
<nekohayo> I guess I'll have to see that video and see how it is
relevant nowadays
<Rigoberto> I like the way avidemux implements it
<Rigoberto> I think its intuitive
<rick_777> okay um....
<rick_777> the problem is, i haven't seen avidemux
<rick_777> but i'll try to install it so i can see how it works
<nekohayo> apt-get it
<nekohayo> :P
<rick_777> not in the repo :P
<nekohayo> wtf, it is
<nekohayo> at least in debian/ubuntu?
<nekohayo> ;)
<rick_777> not in mepis.
<rick_777> ugh.
<rick_777> ok, mind explaining how avidemux works?
<Rigoberto> why don't you install over your winblows installation
<rick_777> OK
<rick_777> booting it up...
<rick_777> downloading...
<rick_777> ah - here's the official adobe premiere explanation for the
jog/shuttle controls
<rick_777> http://livedocs.adobe.com/en_US/PremierePro/3.0/WS810776E4-8A15-4ff5-88B9-E6B712E0BB49.html
<rick_777> and btw, i can see a situation where a professional editor
needs the jog control
<rick_777> the jog control not only works for input video playback, it
will also work to preview the output video
<rick_777> so imagine you're doing a lip-sync editing
<rick_777> and you need precise control over which frame goes where
<rick_777> (in premiere, you would move the jog control and you could
hear the corresponding audio - even if it's tenths of a second
<rick_777> Ok
<rick_777> i'm running avidemux now
<rick_777> i'm loading a .vob file from a dvd
* nekohayo slaps forehead at the overcomplexity of premiere's interface :)
<rick_777> ugh! I need to copy it to the HD
<rick_777> brb
<rick_777> Ok
<rick_777> loaded
<rick_777> so what's the part you want me to do?
<Rigoberto> use the jog disk
<rick_777> what jog disk?
<Rigoberto> beside the video slider
<Rigoberto> over the selection word
<rick_777> well that's not a jog
<rick_777> that's the shuttle
<rick_777> but yes, I see how it works
<rick_777> however it only works when trying to seek a specific
instant in time
<rick_777> btw, i'd love the shuttle control in Saya to look like that
<rick_777> ;-)
<Rigoberto> yeah it looks nice
<rick_777> do you think you could copy it? ;-)
<Rigoberto> got to talk to the lawyer first
<Rigoberto> :p
<rick_777> lol
<rick_777> anyway
<rick_777> i still i see no valid reason why NOT to implement the jog control
<rick_777> please enlighten me
<Rigoberto> can you please explain what the jog does
<rick_777> ok
<nekohayo> except development time, nope
<rick_777> imagine...
<rick_777> that the avidemux slider is a toothed
line
<rick_777> like with gears
<rick_777> but flat
<rick_777> and imagine the jog control is a gear that goes just below it
<rick_777> and connects to it
<rick_777> so if you move the jog clockwise, the timeline slider goes forward
<rick_777> if you move it counterclockwise, the timeline slider (NOT
the shuttle! btw, but the main slider) goes backward
<rick_777> but in a very tiny scale
<rick_777> like one frame per 15 degrees change in the jog
<rick_777> can you see it?
<Rigoberto> so it has large resolution
<rick_777> yes
<rick_777> i've needed to use the jog in the past
<Rigoberto> I think I got it but I would like to try it. Do you know
whether there is a trial?
<rick_777> ?
<rick_777> OH
<rick_777> please watch this
<rick_777> http://www.youtube.com/watch?v=VvvxJuY1EGY
<rick_777> it's an Edius Pro promotional
<rick_777> go to 1:34
<rick_777> and you'll see what i want to do
<rick_777> hmmmmmmmm perhaps i'm mistaking the jog with the shuttle
<rick_777> i need to document myself better
<rick_777> either way, see how the mouse is moving clockwise and the
video advances?
<rick_777> pardon me, it's 1:33
<rick_777> actually i think it'd be useful to watch the complete promo
<rick_777> go ahead, i can wait :)
<rick_777> btw, the promo was of course done 100% with Edius
<rick_777> this is where we want to go
<Rigoberto> wow nice piece of software
<rick_777> how does avidemux fare now? ;-)
<rick_777> anyway
<rick_777> did you see how the mouse moved clockwise and
counterclockwise and the video responded?
<Rigoberto> yup
<rick_777> well
<Rigoberto> I still want to try premiere though
<rick_777> later we might be able to implement such gestures
<rick_777> but i have this idea
<rick_777> have you ever done this in MS Paint?
* nekohayo takes a look
<rick_777> you click on a point, and start moving the mouse clockwise
<rick_777> when in "select" mode
<rick_777> no, wait
<rick_777> better select the "line" tool
<rick_777> click on a point, and then start moving the mouse
<rick_777> clockwise
<rick_777> are you doing it?
<Rigoberto> yup
<rick_777> so, the line follows the mouse pointer
<rick_777> and the axis is constant
<rick_777> now if you draw an imaginary horizontal line....
<rick_777> the two lines would form an angle
<rick_777> the variation of that angle
<rick_777> corresponds to the playback speed when using the jog
(shuttle?) control
<rick_777> the little circle inside the knob
<rick_777> would follow the mouse
<rick_777> the little circle would be the equivalent of the spinning
line you just moved in mspaint
<rick_777> and as it spins clock/counterclockwise, the video advances
forward and backward
<rick_777> as i said, it's a control not so easy to make, but it's necessary
<rick_777> at least for professional users
<Rigoberto> isn't the jog control going to like premiere's?
<Rigoberto> going to look I mean
<rick_777> nope. Premiere's design is counterintuitive
<rick_777> you need to unclick, move the mouse to the left, click,
move it to the right, etc
<rick_777> it's a mess
<rick_777> if you make it circular, you could theoretically advance a
whole blockbuster movie just by moving the mouse in circles
<Rigoberto> I thought you said you wanted ours to look like it
<rick_777> the shuttle control, yes
<rick_777> but i think avidemux's is prettier :)
<Rigoberto> avidemux shuttle is very similar to premiere's jog
<rick_777> yes, but premiere has a shuttle, too!
<Rigoberto> yup just above the jog
<rick_777> it's the tiny slider just above the jog
<rick_777> as you can see, the style varies, but the functionality is
more or less the same
<rick_777> so here are my new requirements:
<rick_777> * Make the slider look like avidemux's shuttle control
<rick_777> or jog, or whatever it's called.
<rick_777> * Make the jog control circular, like a knob with a red
"LED" indicating the direction it's pointing to
<rick_777> * Make the buttons behave just like avidemux (i'll take care of that)
<rick_777> So I guess that sums up what we'll work on this month
<rick_777> k?
<Rigoberto> what about the steps you were saying...
<rick_777> that's all about the jog control
<rick_777> let's review them
<rick_777> 1) Draw the widget using the wxwidgets functions. Just the
little circle with a tiny circle inside it.
<rick_777> 2) Make it more appealing, like perhaps adding a shadow and
background color
<rick_777> 3) Parameterize the drawing depending on a private variable
(like "angle")
<rick_777> 4) Capturing the mousedown event (the name may be different)
to switch on a "dragging mode"
<rick_777> and popping a messagebox saying "dragging enabled"
<rick_777> 4.1) Perhaps with that you could change some color in the
jog control, like making the little tiny circle glow red
<rick_777> like a LED light, that'd be cool
<rick_777> 4.2) When the mouse button is released, make the tiny
circle stop glowing, and the jog is released
<rick_777> 4.3) if you can, when the jog button is pressed, make the
mouse pointer invisible
<rick_777> 4.4) when the mouse button is released, also move the mouse
to exactly where the tiny red light was
<rick_777> oh, i guess i forgot step 4.1.5)
<rick_777> move the redlight along with the mouse
<rick_777> just get the coordinates of the mouse pointer and calculate the angle
<rick_777> 4.5) The hard part will be implementing a "step"
measurement to calculate how much the mouse was moving in a certain
time, and trigger an event
<rick_777> we can leave 4.5) for last.
<Rigoberto> ok
<rick_777> I'm sure you can do that with a little wxwidgets research
and practice
<rick_777> i KNOW you can do it.
<rick_777> grasshopper :P
<Rigoberto> :)
<Rigoberto> lol
<rick_777> well , that's all for today.
<rick_777> any other questions?
<Rigoberto> are you going to send the chat log?
<rick_777> yes, but my computer might crash and the log could get lost
<rick_777> better take precautions
<rick_777> any other questions?
<Rigoberto> noup
<rick_777> Ok then :)
<rick_777> This meeting (and log) is officially closed.
<Rigoberto> good evening folks

Saya developers meeting #3 (2008-11-28): Summary of Activities

This is a brief summary of what was discussed in the third
Saya-VE developers meeting.

About the team:
* No changes, but I was e-mailed by a pro-open-source IP lawyer in
case we need any assistance.

About the tasks:

* We discussed the need for a jog control, and the
difference between a jog (advancing / going back frame by frame) and a
shuttle (fast-forwarding / rewinding at different speeds).
* Jeff gave us an explanation of how controls work in AVIDemux.
* We discussed a bit about how the button 3D effects should behave
* Rigo has been assigned the task of designing a jog control for the
playback controls.
Here's a snippet of the task in question from the chat log:


<rick_777> to avoid getting the "oh no, this is too complicated!
*brain block*" syndrome,
<rick_777> we'll split your task in various subtasks:
<rick_777> 1) Draw the widget using the wxwidgets functions. Just the
little circle with a tiny circle inside it.
<rick_777> 2) Make it more appealing, like perhaps adding a shadow and
background color
<rick_777> 3) Parameterize the drawing depending on a private variable
(like "angle")
<rick_777> 4) Capturing the mousedown event (the name may be
different) to switch on a "dragging mode"
<rick_777> and popping a messagebox saying "dragging enabled"
<rick_777> 4.1) Perhaps with that you could change some color in the
jog control, like making the little tiny circle glow red
<rick_777> like a LED light, that'd be cool
<rick_777> 4.2) When the mouse button is released, make the tiny
circle stop glowing
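The angle computation hinted at in the steps above ("get the coordinates of the mouse pointer and calculate the angle") can be sketched without any wxWidgets dependency. The names below (`jogAngle`, `framesForDelta`) are hypothetical helpers, not from the Saya sources:

```cpp
#include <cmath>

// Angle of the mouse pointer around the knob centre, in degrees.
// 0 = pointing right, counterclockwise positive. Screen y grows
// downward, hence the sign flip on the y difference.
double jogAngle(double cx, double cy, double mx, double my) {
    const double PI = 3.14159265358979323846;
    return std::atan2(cy - my, mx - cx) * 180.0 / PI;
}

// One frame per 15 degrees of rotation, as suggested in the chat.
// Negative deltas (counterclockwise motion) step backward.
int framesForDelta(double deltaDegrees, double degreesPerFrame = 15.0) {
    return static_cast<int>(deltaDegrees / degreesPerFrame);
}
```

On each mouse-move event the widget would compute the new angle, accumulate the delta since the previous event, and emit one frame-step per 15 degrees of rotation, which is essentially the "step measurement" of subtask 4.5.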

About the next meeting:

The next meeting is scheduled for Friday, 26/Dec/2008, unless members
notify otherwise. If anyone has a problem with that date, we can move
the meeting to January next year, or hold it earlier this month.

Sunday, November 23, 2008

Third screenshot: Playback controls!

With the help of Rigo, I've been able to design the playback controls for Saya. Note that the jog widget (a widget that allows you to skip frames by moving the mouse in circles) is missing; we haven't designed it yet.

But basically this is how it'll look.

Thursday, November 6, 2008

Settled with a new job... good/bad news.

The good news is that I got a new job, which will hopefully pay well.

The bad news is that my plans to work on Saya full-time (with your donations) are no longer realizable. Also, my workload for the next few weeks will be somewhat heavy, so development may slow down. Sorry.

Monday, November 3, 2008

Finally! Out of the bottleneck!

The hardest part in the design and implementation of a video editor is the streaming module. For the past three months I've been racking my brain trying to implement it, stumbling upon various obstacles: audio/video synchronization, audio buffers, handling the differences between streaming (encoding) and playback, and dealing with latency issues.

I spent nearly two months rewriting the code to use lock-free data structures (which take care of the latency issues), using four separate threads instead of one for playback, and finishing the design of the Audio/Video Input/Output devices.

And I'm glad to announce that the most difficult part has already been coded! This month I'll be rewriting the demo to use the new playback engine, and although the audio part hasn't been implemented yet, most of the design has already been settled (class AVController is 33% done - but trust me, that 33% was pretty hard!).
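As an aside on what "lock-free" buys here: the usual building block for low-latency handoff between a decoder thread and a playback thread is a single-producer/single-consumer ring buffer. The sketch below uses C++11 atomics for brevity (the project predates C++11 and presumably rolls its own structures); the class and all its names are illustrative, not from the Saya sources:

```cpp
#include <atomic>
#include <cstddef>

// Minimal single-producer/single-consumer lock-free ring buffer.
// One thread may call push(), one other thread may call pop().
template <typename T, size_t N>
class SpscRing {
    T buf[N];
    std::atomic<size_t> head{0};  // next slot to read, advanced by consumer
    std::atomic<size_t> tail{0};  // next slot to write, advanced by producer
public:
    bool push(const T& v) {       // producer thread only
        size_t t = tail.load(std::memory_order_relaxed);
        size_t next = (t + 1) % N;
        if (next == head.load(std::memory_order_acquire))
            return false;         // full (holds at most N-1 items)
        buf[t] = v;
        tail.store(next, std::memory_order_release);
        return true;
    }
    bool pop(T& out) {            // consumer thread only
        size_t h = head.load(std::memory_order_relaxed);
        if (h == tail.load(std::memory_order_acquire))
            return false;         // empty
        out = buf[h];
        head.store((h + 1) % N, std::memory_order_release);
        return true;
    }
};
```

Neither side ever blocks: a full buffer makes push() fail and an empty one makes pop() fail, so the playback thread never waits on the decoder. That bounded, wait-free handoff is exactly what keeps latency under control.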

Now I've told Rigo to hurry with his implementation of the playback controls so we can start testing and debugging the playback.

I'll be updating the project status in the next few days to reflect the changes in the repository. All rejoice!

Friday, October 24, 2008

Saya-VE developers' meeting, Oct 24 / 08

Highlights:

* There are some people interested in joining, but couldn't (or forgot to) join the chat.
* I couldn't work on the project since I've been very busy at work - same for Rigo
* All pending work will be revised on the next meeting

The next meeting will be held on Friday, November 28 2008 at 8PM CDT.

Thursday, October 9, 2008

Saya development may stall for a couple of weeks...

I need to make an announcement. I'm going to install a new Linux distro on my box: I just can't stand PCLinuxOS anymore. There are so many things I need to run that aren't available in the repositories; plus, my install got borked somehow and I can't fix it, primarily because the kernel I installed (2.6.24) isn't available in the PCLinuxOS free repository. Sigh. Polishing the new install and leaving it usable might take a couple of weeks of my free time. More about it in the saya-dev blog.

Also, I'm going to switch to a new job (with better pay) in 2 weeks - so for these 2 weeks my boss will have me very busy - and very tired.

So this leaves the development of Saya frozen *gulp* for a couple of weeks (let's hope it's not more than that). I'm sorry. If only there were more volunteers to help me out...

Friday, October 3, 2008

Project now accepts donations!

I finally set up whatever I needed to accept Sourceforge donations. If you want to contribute to this project with your cash, now you can! :D

Donate here:

Saturday, September 27, 2008

Saya-VE 2nd dev meeting (26/Sep/2008) Summary

This is a brief summary about what has been discussed in the second
Saya-VE developers meeting.

About the Team:

* Nopalin wrote me an e-mail explaining some tragic circumstances that made it impossible for him to continue on the project.
* b3rx hasn't reported back after several warnings, so he's been automatically fired. I've removed him from the website page because he never contributed any code.
* From now on, new members will have to contribute only via patches. At the meeting after a month of activity, we'll decide whether to officially add them to the team or not.
* One person has contacted me via the blog, but I haven't received any e-mail. Let's see what happens.
* So this means the only active developers are me (Rick), Rigobertoc and Javier. And since Javier hasn't reported yet, in the worst case it's only Rigobertoc and me. Recruitment is encouraged.

About the tasks:

* The timeline task assigned to Javier Galicia hasn't been completed yet. For some mysterious reason he was unable to attend the meeting, too.
* The threads module has been completed, and Rigo has worked on the Unit test for that. The multithread unit test is a critical task to continue working on the core.
* The data structures for the timeline have been completely rewritten to allow (de)serialization. But deserialization hasn't been worked on.
* The playback visual controls are stalled due to the work on the threads unit test. They've been given a lower priority.

About the website, and Public Relations:

* We have a new blog: http://saya-dev.blogspot.com/ This blog is mainly for unofficial developer documentation and to share stuff we've learned during development - like programming patterns, new algorithms, etc.
* Our project has become an official member of the Open Source Video Editing Foundation. However, no activity or other formalization has taken place. Personally (and this is conjecture), I don't think we'll see any changes in the near future. Let's hope I'm wrong.
* All team members are to e-mail me a public profile so I can post it on the website.
* Sourceforge has changed its web access method. More info can be found at http://sourceforge.net/community/forum/topic.php?id=3518&page&replies=1

About future plans:

* Once the core has been finished, work will focus on gstreamer integration and the playback controls. Once we get the playback, shuttle/jog and scrubbing controls working, the project will be published on freshmeat.net and the first binary will be released on Sourceforge. At that point, the planning stage will be closed and we will enter "pre-alpha". Things will be smoother from then on.

About the next meeting:

Though it hasn't been formalized (nor discussed during the meeting), all meetings are expected to take place on the last Friday of the month. However, the last Friday of October is the 31st, and it's likely that various team members will be invited to a party due to the American tradition of Halloween. For that reason, the next meeting will be held on Friday, October 24.

Saturday, September 20, 2008

C++ Programmer Wanted

As usual, we're accepting C++ developers who want to cooperate with the Saya Video Editor project.

The requirements are simple: being committed and accepting Open Source as a philosophy, having internet at home, having at least 4 hours per week to develop, and of course, good knowledge and practice of C++. Multithreaded programming knowledge is a big plus.

Thanks.

Friday, September 12, 2008

Website to be down on Sept. 15th and/or 16th.

I just received an announcement from Sourceforge informing me about an upcoming migration of the data centers on September 15th and 16th. Expect some website failures during that period.

Thank you for your understanding.

Tuesday, September 9, 2008

New "Progress Status" page uploaded.

Here it is. The page that will tell you how far we've been progressing on Saya.

http://sayavideoeditor.sourceforge.net/progress.shtml

It's categorized into sections, and each section is divided into items.
Enjoy.

Sunday, September 7, 2008

"Open Source Video Editing Foundation" created!

A guy from a group dedicated to video editing asked me to join him and get to talk with the authors of other video editing software.

I was reluctant at first - why bother people whose video editors are much more advanced than Saya? But he insisted, forwarding his mails to the other people, who joined the conversation. Through a series of e-mails with a bunch of people in the CC: headers, we (I don't even remember who the other people were; this happened too fast) talked about licensing issues, the (now rejected) idea of forming a spec, having a "core library" common to new video editors (I hope Saya's gets chosen ;-) ), alternative OSes like Haiku (an open source BeOS) and FreeBSD, stuff about politics, about having talks with Stallman's FSF, and whatnot.

After around 15 e-mail replies, the Google group has been founded. So this thing is gaining momentum. We've all agreed that minority platforms (with Linux being the largest) need a working Open Source video editor, and we've informally agreed to share source code and tech reports.

Here's the URL.

http://groups.google.com/group/open-source-video-editing-foundation/

There is a website under construction, whose domain is yet to be announced.

Now, I really don't know how this started. It was pretty spontaneous. But there are various Open Source developers in the group, from projects including Traverso, Atheme, and others (maybe it's time to invite the Lumiera guys).

But I guarantee you, this thing is 100% organic grassroots. I was expecting Saya to get attention from F/OSS developers, but not THIS soon! In any case, this gives me more incentive to keep working on Saya.

This is getting interesting.... VERY interesting B-)

Wednesday, September 3, 2008

Unofficial dev blog created!

As I keep learning new programming techniques and algorithms, I decided I couldn't just keep them to myself. The world needs to know.

But that doesn't belong to the project, so I decided I would create a new blog:

"Saya Video Editor unofficial dev blog"

It's available at http://saya-dev.blogspot.com/

This way I can post somewhere else the technical stuff I do about Saya (or about any other thing for that matter), and keep this blog for Press, Media Attention and Public Relations, among other synonyms.

Tuesday, September 2, 2008

Threads module almost completed!

It was like a rollercoaster ride. At first I thought it would be much easier, like "OK, just do your stuff on the wrappers and build a few more wrappers around pthreads and windows functions. How hard can it be?"

Oh, boy. As I was examining the wxWidgets code that I modelled the classes after (it turns out I ended up copying a lot of it - don't worry, all copyright notices are left intact), I realized that Windows XP doesn't implement condition variables (but Vista does - who'd have thought?).

Huh?

OK, to put things in perspective: there are three basic kinds of thread synchronization objects: mutexes, semaphores, and conditions.

----------- begin boring explanation -----------
A mutex is an object that can be locked by ONLY one thread. The locking of a mutex is called an "atomic operation", it means that you won't be interrupted by another thread while you're doing your stuff. And as long as the mutex is locked, you're safe.

(There is an equivalent of mutexes that works only between the threads of a single process: critical sections. Instead of locking / unlocking them, you enter and leave them. But they're practically the same.)

A semaphore is like a racing semaphore: One process is the car that "waits" for the semaphore to turn green, another process is the guy in charge of the semaphore. The interesting thing is that ANY process can set the semaphore to green, and this is where things get interesting.

A "condition" is kinda the opposite of a semaphore: Instead of many guys signaling one car, it's one guy signaling ONE or MANY cars. You can signal all (it's called a broadcast), or only one. But a condition requires a mutex for organization, it's kinda complicated. But it lets you send signals to a thread when you want to tell it to do something (if it's not busy already).
----------- end boring explanation -----------
Anyway, here's the trick: conditions aren't implemented in Windows (before Vista), so you have to implement them using semaphores and mutexes (which are implemented).
But the POSIX threads API doesn't give you semaphores, so there you have to implement them using conditions and mutexes.

So here I was, copying code and organizing it. Then I realized that the most difficult part was the mess of creating a thread. It's not just a single call: you need to set up parameters, implement error-handling functions, etc., etc.

And I realized that the POSIX implementation of threads in wxWidgets was much cleaner than the Windows one. So I reorganized it and dropped some Windows-specific "optimizations" (they're actually more like shortcuts; for example, in Windows you can start a thread in a dormant state, but in POSIX you need to use a semaphore to make it sleep). The result was much cleaner code.

I still need to implement/copy/adapt the code for killing a thread, and the code to delete all threads on program shutdown. It's easier now that I've implemented/copied/adapted nearly everything.

When I do my first thread experiment, it'll be quite fun. I spent many nights this week trying to finish this.

The bad news is that, as things look, we probably won't have a workable video editor by the end of this year (I began in May, and it's September already? Aw crap). It might take another 6 months or more. But at least I know I'm much further along than I was when I started the project.

But let me tell you. When I get the core finished, things will start to get REALLY interesting. I'll keep you updated.

Friday, August 29, 2008

Saya-VE 1st dev meeting (29/Aug/2008) Summary

Saya-VE 1st developer meeting (29/Aug/2008) Summary of activities.

CHANGES IN THE TEAM:

* Developers who were silently kicked out (surprise! This wasn't
mentioned in the meeting) due to lack of activity and reporting:
Nopalin, Wireshark. Note to Nopalin: You earned points by developing
part of the UI, so you have a good chance of being accepted again.
Please message me.

* Developers who left: C.J. Barker.

* Developers who joined: Rigoberto C., Javier Galicia

OFFICIAL STATEMENTS:

* Official communications channel for the devs are this group (
http://groups.google.com/group/saya-dev/ )
and gmail chat (aka GTalk, Jabber). For this, all members are required
to get a GMail account and enable their chat in their gmail page.

* The project HQ is located at developer.berlios.de, project's name:
"saya". Here we'll deal with bugs, features, tasks, and the SVN
repository is located there.

* Sourceforge will be used ONLY for the website ( http://sayavideoeditor.sourceforge.net/
) and for releasing binaries / docs.

(oops, forgot to say in the meeting: Devs must get accounts both at
Berlios and Sourceforge)

* Meetings like this one will take place every last Friday of the
month, at 8PM CDT (that's -0600 with daylight savings). The calendar
can be seen at
http://www.google.com/calendar/embed?src=sbij7s3h23o0bhrrt4kmeppisk%4...

* Members will be given write-access to the calendar as they prove
their worth.

* New members must prove their worth by submitting one or two patches
before being given SVN access.

* If a dev won't be available for some time (i.e. vacations), he MUST
post it on this group (there's a specific thread for it).

* A "status with progress bars" page will be added to the website.

* Links with research info will be given to me so I can post them on
the website, under the "research" page.

TASKS GIVEN:

* Jeff is the Vegas expert. Ask him anything about UI design. He'll
also design the progress report with colored bars, which I'll post on
the website as soon as it's given to me.

* Bertrand will work on the CODEC module (GStreamer)

* I (rick) will work on the CORE and RENDERER module. The effects /
rendering part is still pending. If anyone wants to give me a hand
with the thread classes implementation, he's welcome.

* Javier Galicia will work on the Timeline. If anyone knows wxWidgets,
give him a hand.

* Rigoberto C. Will work on the playback controls for the preview
window.

NEXT MEETING WILL TAKE PLACE ON:

Date: September 26, same hour (8PM CDT, that is -0600 with daylight savings).

Server: irc.freenode.net (make sure you register your nickname by /msg
NickServ register ).

Channel: #saya-dev

See the calendar for details.

Note: Remember, this meeting will be devs-ONLY. Foreigners will be kicked out and thrown to the dogs :P

First Devs Meeting a huge success!

I finally got all the team members online. Unfortunately, some devs did not attend and didn't even report in. They'll be removed from the project ipso facto (sorry, I gave enough warnings, and this isn't a treehouse club).

More bad news: CJ Barker had to leave the project. His schedule suddenly became very tight, and this will be a long-term thing. We'll miss you.

About the meeting:

We talked about ourselves (brief intro), project expectations, official communication channels and how to organize ourselves. I also set up tasks to do. Expect a "Progress bar" to appear in the webpage soon. The meeting log will appear on our private google group.

I'll keep you updated.

Saturday, August 16, 2008

Vacations, and a member is back! Kinda

Hello everyone! Posting from the beautiful lake landscape of Guadalajara. The air's clean in here! :P

I got an unexpected message from one of the more "quiet" members of the team. Due to circumstances beyond his control, he was completely unable to log in for almost 2 months!

(I'm telling you, this project is cursed! Hard drives crashing, people getting fired and/or having car accidents, is this some kind of conspiracy?)

But everything's going smoothly. That other dev is back, and I just found a new programmer from Mexico (thanks, OHLOH.NET) who's very eager to join the team!

So, we might be late on the schedule, but this project is not turning back!

I'll keep in touch.

Tuesday, August 12, 2008

Taking a one-week vacation this friday.

This Friday I'll go visit an old internet friend. I'll also install Linux on his PC ;-)
So I'll be out for a week, and go back this August 25.

*sigh* 4 months in, and progress on the project has been very slow :(

Anyway - If you don't hear from me the next month it means a bus hit me or a plane crashed on me or something. Please pray for my safe return and everything :)

When I return, we'll arrange a devs-only meeting on irc. Stay tuned.

Playback framework, high resolution timers.

I've advanced in the playback framework - the core of our editor. I've designed the AVController class, and I'm in the playback part - the part where you move data from the input to the output and keep the video and audio in sync.

For that I had to write a high-resolution (millisecond-precision) timer function. Modeling it on SDL's API, I built the syGetTicks() function - it gives you the number of milliseconds that have passed since the program was started. Unfortunately, the stupid Windows API doesn't have a Unix-time-compatible function, so I had to break my head trying to make it work.

The Windows GetSystemTimeAsFileTime function returns a 64-bit integer (well, two 32-bit integers, actually) that gives you the number of 100-nanosecond units (Whiskey Tango Foxtrot?!) since 1601. Wha? How am I supposed to convert that?

Well, easy. You just divide it by 10,000,000. And how to do that with 32 bit math?

Easy. Let's use some algebra.

(A * 2^32 + B ) / C = (B/C) + (A*2^32 / C )

The low part of the division is taken care of. And the second term is just as simple:

(A * 2^32) / 10^7 = A * (2^32/10^7) = A * 429.4967296

What do you think? We only have to multiply the high part by a floating point number and we'll get our result. However... we don't want to use floating point math in a high-resolution timing routine. So instead, we'll do this:

A * 429.4967296 = (A*429) + (A*0.496) + (A*0.0007296)

Luckily, these numbers have exact fractional equivalents.

A * 429.4967296 = (A*429) + ((A*62)/125) + ((A*57)/78125).

Ta-da! So the final result is:

result = (low / 10000000) + ((hi*57)/78125) + ((hi*62)/125) + (hi* 429);

And we have the WinNT 32-bit equivalent for obtaining the number of seconds since 1601.

If at startup we obtain an initial counter, for subsequent calls we only need to subtract that number and we'll obtain the number of seconds that have elapsed since the program started.

Obtaining the milliseconds part of the ticks was easier. Windows has a GetTickCount() function, which returns the number of milliseconds since the system was started; we just take it modulo 1000 and keep the sub-second part.

Here are the final functions. The sy prefix is for "saya". Note that the Windows part hasn't been tested yet :P
If you want the final version, please check the Saya-VE source code (SVN) at
http://developer.berlios.de/projects/saya/


/**************************************************************
 * Cross-platform high-resolution timer functions.
 * Copyright: Ricardo Garcia
 * Website: http://sayavideoeditor.sourceforge.net/
 * License: wxWindows License
 **************************************************************/
#ifdef __WIN32__
    #include <windows.h>
#else
    #include <sys/time.h>
#endif

unsigned long syGetTime();

unsigned long sySecondsAtInit = syGetTime();

/** Returns the number of seconds since the epoch
 *  (1601 on Windows, 1970 on Unix). */
unsigned long syGetTime() {
    unsigned long result;
    #ifdef __WIN32__
        FILETIME ft;
        GetSystemTimeAsFileTime(&ft);
        unsigned long low = ft.dwLowDateTime;
        /* We discard the highest 16 bits of the high dword;
           we don't want to overflow the calculation. */
        unsigned long hi = ft.dwHighDateTime & 0x0ffff;
        result = (low / 10000000) +
                 ((hi*57)/78125) +
                 ((hi*62)/125) +
                 (hi*429);
    #else
        struct timeval mytime;
        gettimeofday(&mytime, NULL);
        result = (unsigned long)(mytime.tv_sec);
    #endif
    return result;
}

/** Returns the number of milliseconds since program startup. */
unsigned long syGetTicks() {
    unsigned long result;
    #ifdef __WIN32__
        /* Seconds since startup, converted to milliseconds,
           plus the sub-second part from GetTickCount(). */
        result = (syGetTime() - sySecondsAtInit)*1000 +
                 (GetTickCount() % 1000);
    #else
        struct timeval mytime;
        gettimeofday(&mytime, NULL);
        result = (unsigned long)(mytime.tv_sec - sySecondsAtInit)*1000;
        result += (((unsigned long)(mytime.tv_usec)) / 1000);
    #endif
    return result;
}

Thursday, August 7, 2008

How to implement the renderers? Draft 1.

Actually this is more like a brainstorm, but bear with me :)

So far, we have been able to make a workable implementation of VideoOutputDevice. It has the following members:


class VideoOutputDevice : public syAborter {
    public:
        VideoOutputDevice(); // Constructor
        bool Init(); // Initializes the output device
        bool IsOk(); // Is the device OK?
        bool IsPlaying(); // Is the device currently being transmitted data?
        void ShutDown(); // Can only be called from the main thread!
        VideoColorFormat GetColorFormat();
        unsigned int GetWidth();
        unsigned int GetHeight();
        // Can only be called from the main thread!
        bool ChangeSize(unsigned int newwidth, unsigned int newheight);
        void LoadVideoData(syBitmap* bitmap);
        virtual bool MustAbort();
        virtual ~VideoOutputDevice(); // Destructor
    protected:
        // ...
    private:
        // ...
};

The renderer must invoke VideoOutputDevice::Init on playback start and VideoOutputDevice::ShutDown
on playback end; the same goes for AudioOutputDevice::Init and AudioOutputDevice::ShutDown.
Additionally, it must call VideoOutputDevice::LoadVideoData regularly (in case of playback) or for every frame
(in case of encoding). Therefore, it needs a way to know the input's framerate, and it also needs to know the
input's audio frequency.

It needs to be multithreaded so that the framerate doesn't depend on the main thread's GUI being blocked
or anything like that.

Let's assume that it's VidProject which tells the renderer what the framerate is.

So we have:

void Renderer::Init(VideoInputDevice* videoin, AudioInputDevice* audioin,
                    VideoOutputDevice* videoout, AudioOutputDevice* audioout);


With this we mean we're gonna need new classes for input: VideoInputDevice and AudioInputDevice.

bool Renderer::SetVideoFramerate(float framerate);


And now, onto the playback functions:

void Renderer::Play(float speed = 1.0,bool muted = false);
void Renderer::Pause();
void Renderer::Stop();
void Renderer::Seek(unsigned long time); // Time in milliseconds to seek to


All that's fine, but what happens when we want to display a still frame? We don't know which video output device we
have - a player or an encoder - so there must be some way to send a still frame to the video device.

void Renderer::PlayFrame();
// (Note that this should either be a protected function or only be enabled
// when the video is paused; otherwise we could desync video and audio)


Now that I think of it, sending still frames is exactly what video playback does. Every N milliseconds, we send a frame to the
output buffer. So there must be separate seeks for video and audio.

void Renderer::SeekVideo(unsigned long time);
void Renderer::SeekAudio(unsigned long time);


And if we're seeking, there must be a way to tell if we're past the clip's duration.

bool Renderer::IsVideoEof();
bool Renderer::IsAudioEof();


And it seems we'll need separate video and audio functions for everything (edit: NOT!)

void Renderer::PlayVideo(float speed = 1.0);
void Renderer::PlayAudio(float speed = 1.0);
void Renderer::PauseVideo();
void Renderer::PauseAudio();
void Renderer::StopVideo();
void Renderer::StopAudio();

But I wonder if having separate stop functions is a good idea at all, because of sync issues. I mean, if we don't
want the audio or video to be shown, we just don't decode it. It's a matter of seeking, decoding, and sending.
So PlayVideo and PlayAudio will just enable or disable video and/or audio, and we'll only need the shared Pause and Stop.


void Renderer::PauseVideo(); SCRAPPED
void Renderer::PauseAudio(); SCRAPPED
void Renderer::StopVideo(); SCRAPPED
void Renderer::StopAudio(); SCRAPPED

I think that with this info we'll be able to design a good rendering / playback framework.
Stay tuned.

Wednesday, August 6, 2008

syBitmap finished! Now what?

The dev in charge of the Video playback controls is going to be away this week. So maybe it's time to start designing the Renderer API.

Which functions will it have? How will it tell the codecs to read a file? How to handle video and audio sync? How to handle the threads?

Too many questions, any help appreciated. Thanks.

Saya-VE without SDL, experiment 1

Finally, my efforts are beginning to show results. I realized that I had committed several mistakes (read: bugs) while implementing wxVideoPanel. While fixing them, I also improved the code a little.

Additional bitmap
The most important bug was trying to save time by not creating another buffer. This caused a crash when resizing the panel under certain conditions. By using another buffer for wxVideoPanel, and updating it from wxVideoOutputDevice::RenderData(), I could finally be sure that wxVideoOutputDevice's bitmap was not accessed at the wrong time. As a bonus, this means that while the video is paused, I still keep a copy of the buffer (oops... now that I think of it, the bitmap info is actually lost when resizing. I'll fix that soon).

syBitmapCopier
I finished implementing the syBitmapCopier class. Most functions are inline, so no stack space will be used when invoking them (well, some variables were required, but those are unavoidable).

syAborter
I also did some cleanup. I moved all the thread functions to syBitmap. I replaced the VideoOutputDevice* pointer in syVODBitmap with an syAborter* pointer. syAborter is an abstract class with only one method: bool MustAbort(), which indicates whether an expensive operation must be aborted immediately. Then I made VideoOutputDevice and AudioOutputDevice subclasses of syAborter.

What this means: syBitmap has all the required functions to be thread safe, and integration with ANY VideoOutputDevice class will be a piece of cake.

Classes cleanup
Now that all the syVODBitmap functions were moved to syBitmap, syVODBitmap was no longer necessary, so I deleted it.

And now, ladies and gentlemen... the demo!

The last bug I had made was calling an expensive wxWidgets function inside a for(x)... for(y) loop. No wonder the display was so slow. But now the wxVideoPanel demo is fully functional. And here it is!


The Demo() function (actually, method) of wxVideoPanel regularly creates a nicely colored image of arbitrary dimensions (the ripples change roughly every 5 ms), which is later scaled to fit the panel's dimensions. This way, whether your video is 4:3 or 16:9, it won't be distorted.

After being created, the image is sent to wxVideoOutputDevice via the LoadVideoData() method. This method copies the data to its own bitmap, and then, in its RenderData() method, it calls wxVideoPanel::LoadData().

wxVideoPanel::LoadData() locks its own bitmap and pastes the data. wxVideoPanel's bitmap is locked because two other functions also access it (each locking the bitmap as well): OnResize and OnPaint.

wxVideoPanel::OnIdle() checks if new data has been loaded, and calls OnPaint() if necessary. OnPaint() uses a wxBufferedDC to repaint the screen.

This way, we have our nicely colored image which changes in realtime without flickering at all. Ta-da!

Monday, August 4, 2008

From SDL to MyOwnVideoImplementation (TM)

After trying out the SDL video demo, I realized that for a video editor I won't need sprites, 3D textures or anything like that. It would be easier to write my own bitmap buffer in memory. So I did, and I ended up creating the wxVideoPanel and wxVideoOutputDevice classes.

Unfortunately, the screen refreshing routines were awful. No, worse. They were hideous. I had to calculate everything manually, handle the pixel color spaces, etc. There had to be a better way. And hence, I came up with syBitmap: A cross-platform implementation of an in-memory bitmap. It has a virtual MustAbort() function which you can adapt for multi-threading purposes.
Currently I've been able to replicate the SDL example, but I was too busy and tired so I couldn't upload the screenshot.

As an added bonus, I created the derived class syVODBitmap (VOD stands for Video Output Device), which also has Lock() and Unlock() functions (also for multi-threading).

The best part is that I could add a PasteFrom function to syBitmap, so that the copying also scales and centers the source bitmap to fit the destination. Unfortunately, the implementation isn't as fast as I wanted because it uses floating point math. But I plan to replace it with fixed point math so the copying won't become a bottleneck.

Still, the implementation is both ugly and slow. So I ended up creating another class (which I have yet to commit to SVN): syBitmapCopier. The idea behind it is this: instead of calculating a pointer by multiplying y*width and adding x, and then obtaining the pixels' color format, for every single copy, we just init the class with the source and destination bitmaps, and these members are calculated only once.

I have designed functions to copy pixels and increment only the source pointer, only the destination pointer, or both. I've also designed functions to copy entire rows and advance either / both of the pointers by one full row. This way we'll have no worries about having to recalculate parameters for each pixel or passing them through the stack. Who knows, maybe I can inline all of these functions to get a super-efficient bitmap copier.

As soon as I finish the syBitmapCopier implementation, I'll make a multi-threaded demo to see how many frames per second I can get. And then I'll start making the Video playback UI, which is already overdue.

Sunday, July 27, 2008

Saya-VE with SDL, experiment 1

I pasted and adapted some code from the wx-sdl project (see link in previous entry) into Saya. I created a new class called SDLPanel (original, heh?), and used it instead of a common wxPanel.

So far, here's how the painting works:

1) Check if the panel has an active SDL_Surface object
// can't draw if the screen doesn't exist yet
if (!m_Screen) {
    return;
}
2) Lock the SDL_Surface. This thing is thread-safe!
// lock the surface if necessary
// (SDL_LockSurface returns 0 on success, -1 on error)
if (SDL_MUSTLOCK(m_Screen)) {
    if (SDL_LockSurface(m_Screen) < 0) {
        return;
    }
}

3) Create an in-memory bitmap (a wxImage) based on a SDL_Surface's memory. Once you got the image, you create a wxBitmap based on the wxImage. I really hope this isn't double-memory copying and that the wxImage uses the actual memory.
// create a bitmap from our pixel data
wxBitmap bmp(wxImage(m_Screen->w, m_Screen->h,
    static_cast<unsigned char *>(m_Screen->pixels), true));
4) Now that we created the bitmap, we can safely unlock the SDL_Surface.
// unlock the screen
if (SDL_MUSTLOCK(m_Screen)) {
    SDL_UnlockSurface(m_Screen);
}
5) The painting is done with standard techniques:
// paint the screen
wxBufferedPaintDC dc(this, bmp);

For our purposes, the SDL_Surface can be a software bitmap without 3D acceleration, residing in the computer's RAM. In other words, it's a plain and simple memory buffer.

And I had to load a full-fledged multimedia library for a simple in-memory bitmap!?!? C'mon!! But hey, it works :D

Here's the screenshot (note: the little color lines actually vary in real-time, it's cool).


Sweet, isn't it? Too bad I'll have to change the code back to keep working on this.

But from this we can incorporate the VideoOutputDevice class so we can actually display clips and images. This will be a major breakthrough in our editor.

Wednesday, July 23, 2008

First experiences with SDL

It's time for Saya to get a video viewer, but we haven't been able to do anything yet. Enter wx-sdl.

wx-sdl is a tutorial/sample of using SDL with wxWidgets. It's a single file, and it has everything you need to draw your own video on a screen surface. The best thing is that it's LGPL licensed, so it won't be a problem including it in Saya! :D (two thumbs up!)

There's a class called SDLPanel, but it's customized for that sample, so we'll need to adapt it to make it generic. Still, the integration with wxWidgets is amazingly simple. I hope to have it included in Saya by next week.

In related news, I had thought that including the SDL source code would be a piece of cake. I guess I was wrong - it's a huge library, so it will go the same way as wxWidgets: I'll link to it instead of embedding it.

Fortunately, all Linux distros already include it or have RPMs for it, so installation won't be a problem. And you don't need to compile it on Windows either; it already comes as a handy-dandy DLL :)

That means I'll need to post another edition for the developers' guide... sigh.

New developer joined!

Everyone, give a warm welcome to Robert Molnar a.k.a. wireshark. He's an experienced wxWidgets programmer (I think the first assignment I'll give him is to start designing the timeline :) ), and he has helped a lot of other programmers, at least on the wxForum. He's certainly got what it takes to develop a full-fledged video editor such as Saya.

Welcome aboard!

Monday, July 21, 2008

Congratulations! It's a boy!

It's time to celebrate. Our effort has given birth to our first offspring. What the - offspring? Actually, I'm talking about Nopalin's first code commit ^_^ Well, I had to help a little, but the new project dialog is progressing. Congratulations!

Oh - by coincidence, it turned out to be my birthday. So it's a double celebration then :)
In other news, I finally figured out how to implement the video display and keep the current framework design. I'll try to do the first tests this week. Wish me luck!

Wednesday, July 16, 2008

Expensive decision; Improved New Project Dialog

Two pieces of good news: I asked for permission at my job to work one hour less per day (without pay, of course) starting today, and my boss accepted. This decision cost me a good deal of money that I won't earn anymore, but it's certainly worth it - and the result is that today I could finally spend a lot of time working on Saya.

As proof - and that's the second piece of good news - here's the improved New Project dialog, based on the GNOME Human Interface Guidelines (not all of them were followed, but it's an improvement):

Before:



After:



As you can see, the new dialog is thinner and less cluttered. Also, I removed the redundant "Aspect Ratio" combo box. Pixel Aspect is more than enough for the video settings. If later we decide to scale the project to a certain screen format, I'll add a button to calculate the pixel aspect ratio.

(Note: The cyan background for the text controls is a personal desktop setting, just pretend there was white in there).

Tuesday, July 15, 2008

Developers' guide for Linux uploaded!

Oops. Make that "GNU/Linux". Anyway, thanks to CJ Barker for the draft. The files are available on Sourceforge.

Job + personal problems + stress = ???

Sigh. It's been more than a month since the project was founded, and it feels like I haven't been able to accomplish anything.

These last 3 weeks I've had too much work at the job, plus personal problems at home. These have left me exhausted, without time or energy to work on the project when I get home.

Then I opened the project and saw that very little code has been written. So I asked myself: what the heck have I / we been doing this month? I felt so frustrated and powerless. It's even worse when I can't get a quick e-mail reply from one of the team members. What will happen if we keep having the same problems / interruptions and can't get anything done in the next 3 months?

But then I realized that I've written the developers' guide (a task not trivial at all), an explanation of the project design, and the other developers have been organizing themselves.

Yes, it's true, not much code has been written, but it won't be like this forever. CJ Barker has mailed me telling me that he'll have time this weekend to work on the resources panel. Nopalin is putting a lot of effort in learning wxWidgets, and more research has been done.

Tonight I'll start rewriting / publishing the Linux developers' guide (thanks to CJ for writing the first draft).

Anyway, I've also noticed a lack of organization. I'm thinking that perhaps we should have monthly meetings on the IRC channel - that's why it was created in the first place.

So, please bear with us a little more. We're not perfect, but we're not slackers either. We're very few members and we're doing our best to make this work.

Monday, July 14, 2008

Car accident; Human Interface Guidelines

The bad news: One of our team members had a car accident and he had to be taken to the hospital :( Fortunately, he's fine now, but he'll need a bit of rest. Let's wish him a fast recovery.

The good news: After a brief discussion (but not an argument) with some of the team members, I've decided to follow the GNOME Human Interface Guidelines. This will make our dialogs VERY user friendly and not cluttered at all :)

Thursday, July 10, 2008

New dev, engine warming up :)

Two pieces of good news. One: we got a new dev, Rigo, an old friend from school.
Two: everyone's back from their away status. Nopalin has been studying hard and is learning fast. I hope the New Project dialog will be completed soon.

Also, some bugs in the "Welcome" dialog have been fixed, and bitmaps have been added to the big buttons.

It seems that real development has started! :)

Stay tuned.

Saturday, July 5, 2008

Framework design is up!

The Saya-VE framework design has been uploaded to the website (it's licensed under the GNU FDL). Also, the developers' guide section now points to the downloadable dev guide.

Monday, June 30, 2008

Whew! Developers' Guide finished.

After two long weeks of overnight work, I finally finished the Developers' Guide for Windows. It's available on the Sourceforge page in the downloads section.

Let's hope the Linux devs don't require a developer guide :P

Monday, June 23, 2008

Having second thoughts about gstreamer...

Recently I've been writing on a mailing list about the ffmpeg project and why we need an alternative. A little browsing led me to take a second look at GStreamer. Therefore, my previous post entitled "On Diva, gstreamer and tiers gone wrong" seems to be inaccurate.

In theory, GStreamer is everything a video editor developer has dreamed of... but I couldn't get hold of the developer guide (it's in development. How ironic). However, I have a hunch telling me that I won't be able to access frames and audio samples directly, and that I'll have to rely on plugins such as gnonlin instead.

Let's hope not. I mailed one of the Pitivi authors for guidance. Let's see what we find out. Cross your fingers.

Saturday, June 21, 2008

How to make a portable Codeblocks project

Perhaps this will help other developers who want to write cross-platform applications with Code::Blocks. After breaking my head for several hours, I finally found out how to do it. Code::Blocks 8.02 includes a script engine which can make certain compiler options apply only to Windows environments.

Example: In your project settings:
Under "Compiler settings", "other options":

[[if (PLATFORM == PLATFORM_MSW) print(_T("-mthreads -D__GNUWIN32__ -D__WXMSW__"));;]]
`wx-config --cflags`


(Note: the double ;; at the end is a workaround for a scripting bug.)
As you can see, the one-line script is pretty much like C. The _T must have been incorporated for wxWidgets compatibility. In any case, notice the compilation string: -mthreads, -D__GNUWIN32__ and -D__WXMSW__. The -D flags are compiler defines. I tried to add scripting to the #defines section, but that didn't work, so I had to add them on the compiler command line.

The wx-config part is for Linux. I've found experimentally that having this option in the project doesn't affect compilation under Windows.

The same can be done with the linker. Under Linker settings, other linker options:


[[if (PLATFORM == PLATFORM_MSW) print(_T("-mthreads -lwxmsw28u -lintl.dll"));;]]
`wx-config --libs`


Here I tell the linker to link two Windows-only libraries: libwxmsw28u.a and libintl.dll.a (libintl is used for internationalization). For POSIX environments (OS X or GNU/Linux), the backticked expressions are more than enough, but you could do the same with "if (PLATFORM != PLATFORM_MSW)".

Under search directories, I add both the Windows and posix directories.

Compiler search dirs:


$(#wx.include)
$(#wx)/contrib/include
$(#wx)/lib/gcc_dll/mswu


Linker search dirs:


$(#wx)/lib/gcc_dll


Resource Compiler search dirs:


$(#wx.include)
$(#wx)/lib/gcc_dll/mswu


With these simple settings, you won't need two different projects - one for GNU/Linux or Mac OS, and another for Windows.

This will help making cross-platform projects quite easy.

Friday, June 20, 2008

Windows = Pandora's box!

Today was a hectic day. Not only did I have to run Windows to get my tax declaration done (the app doesn't run on GNU/Linux), but I also had to use Windows to set up Code::Blocks.

Once upon a time... the second most active developer, Nopalin, had problems with his hard drive and ended up installing Windows. Pandora's box had just been opened on me. I now had to download MinGW, compile wxWidgets, download an additional libintl from gnuwin32 (it doesn't come with MinGW by default), install Subversion, and... guess what.

As I had expected, I had compilation problems with the project, which was configured for a POSIX environment. I had to modify the global settings for Code::Blocks under Windows (something you must NOT do; that's what the project settings are for!) until I found out how to make the project's build properties cross-platform (a capability not implemented in Code::Blocks yet; maybe they have scripting capabilities now, but I don't know of them).

And then the wxDateTime method FormatISODate() output garbage on Windows. When that was fixed, I noticed some other problems: the buttons in the welcome dialog were invisible, and the new project dialog had the wrong size! :(

How did all this happen? Well, I'll be able to find out as soon as I make the project cross-platform. In the worst cases, I'll have to use the same tactic the C::B devs did: Use parallel project files. Ugh. Let's hope I don't have to resort to that. Sigh...

Why "Lone Ranger" programming will never work.

Recently I got an e-mail from someone interested in video editors, pointing me to a novel video editor in the works.

The problem: The thing's written in [obscure programming language which is neither Python nor C++]. Oh, you didn't know there was a programming language called [obscure], right? See, THAT's the problem.

I'll quote some text from the other programmer (I'll rephrase all paragraphs to protect the innocent from Google searches):
Basically I'm sick and tired of having a non-working video editor and a lot of novel ideas that other editors lack. And I've rewritten it so many times that I've got to get this off my head.
For the past four years I haven't been programming because of school. But this year might allow me to start working on [editor] again.
Later, I read:
Lately I’ve realized that programming takes too much time from my life.... I almost made it work... I’m not a programmer. Programming in [obscure programming language] is my hobby, not my profession. I don’t get any money from it. It's a problem when programming takes about 80% of my time and I don't have time left for normal activities ... my projects will be put on hiatus. Feel free to ask me for rights to commit to the subversion of [editor]. You can take over the project for now. I hope I’ll get back to programming [editor] with a little more peace of mind after next year [written on December 2007].

And this, ladies and gentlemen, is why the Lone Ranger approach to Open Source programming WILL NEVER WORK. Do you have the slightest idea why so many SourceForge projects are abandoned? The site is like an Open Source project cemetery. Well, here's the reason: those projects had NO programming teams.

I absolutely refuse to work on a hobby project if I have to do it all by myself, with tools I have to write myself because no tools exist for an esoteric programming language (let alone a binding to [ famous multimedia library ] ).

Perhaps you'll understand now why I chose to use C++ and wxWidgets. There are at least 10 C++ programmers for every [obscure programming language] programmer. And there are at least 10 Windows users for every Linux user. If I'm going to have a programming team and a live project, I'd better use technologies that let me recruit the most volunteers.

After all, a Non-Linear Video Editor is not something you can make in your garage.

Thursday, June 19, 2008

New mailing list + one developer less = still good!

Today I received a letter from the MIT guy (the one who never answered my mails) asking me to remove him from the project. However, I suspect that either his request to get in or his request to get out was forged (meaning someone got access to his account), because they were written in very different tones. Too bad; we can't allow someone with such poor security into the project. Reliability: 50% -> zero.

In other news, I opened a private mailing list for the developers so we can organize ourselves better. I'm also working on the developers guide so anyone can install the software required to compile and run Saya.

Then we'll start working, and hopefully in a couple (ok, maybe 4) weeks we will release version 0.1, code name: Aikuchi. That will mark the end of the planning phase and we will officially become "pre-alpha". Keep in touch! :)

Wednesday, June 18, 2008

Class diagram for AVClip

So how are we managing the effects?


An AVClip has a vector of effects.
An Effect consists of a map of (string -> FXParamTimeline), where the string denotes the parameter's name.
An FXParamTimeline is a map of (integer -> string parameter), where the integer represents a point in time.

(Notice that there are no bezier curves in this diagram. But there will be, don't worry about that).

Time is measured in milliseconds, but I'm going to replace the integer with a 64-bit integer so I can represent nanoseconds (this is necessary to be able to move audio clips around with one-sample precision in time).
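A minimal sketch of these structures in C++ (the field names and layout beyond AVClip, Effect and FXParamTimeline are my own guesses for illustration, not Saya's actual code):

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// A parameter's timeline: time -> serialized parameter value.
// The key is a 64-bit integer so time can be stored in nanoseconds.
typedef std::map<std::int64_t, std::string> FXParamTimeline;

// An effect: parameter name -> that parameter's timeline.
struct Effect {
    std::map<std::string, FXParamTimeline> Params;
};

// A clip holds its effects by value: no pointers anywhere, so copying
// a clip deep-copies everything, and serialization is straightforward.
struct AVClip {
    unsigned int ResourceId; // index into the project's resource list
    std::vector<Effect> Effects;
};
```

Copying an AVClip gives you an independent duplicate for free, which is exactly what the clipboard needs.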

One thing I forgot to mention is that none of the data structures I present here have pointers (pointers are EVIL!). This lets us duplicate instances of effects and clips easily. Since the indirection is handled through array (or vector / map) indexes, we won't have segmentation faults due to a badly dereferenced pointer. This also makes serializing and deserializing the data a piece of cake.

When copying a series of clips to the clipboard, I will replace the clip IDs with new ones. Simpler than a Jedi mind trick. "These aren't the IDs you're looking for." *Waves hand* No pointer mangling, no headaches.

This is the reason I'm trying to focus on the User Interface first: it's the task that takes most of the time. And since I already got rid of the wxWidgets-specific classes in this framework, we can choose whatever UI library we want. But I'm sticking with wxWidgets for the user interface.

Class diagram for VidProject

Who'd have guessed there was a tool to convert C++ code into UML diagrams? It's called Umbrello.
Here's a simplified UML class diagram for VidProject and associated classes:


The Video Project has a list of the currently used resources (video and audio clips, i.e. files). Additionally, it contains an AVTimeline, which holds various sequences. Each sequence contains various audio and video tracks. And each track contains... surprise, clips. Each clip has an index indicating which resource it operates on. And of course, a list of effects and a transition.

Notice that a video clipboard is nothing but a sequence. Convenient, isn't it?

As you can see from this diagram, the memory consumption for these classes is minimal, since they're nothing but sparse data structures.
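As a rough sketch, the hierarchy described above could look like this in C++ (the names other than VidProject, AVTimeline and AVClip are guessed for illustration, and the real classes certainly carry more fields):

```cpp
#include <string>
#include <vector>

struct AVResource { std::string Filename; };   // a source file on disk

struct AVClip { unsigned int ResourceId; };    // simplified: effects and transition omitted

struct AVTrack { std::vector<AVClip> Clips; }; // one audio or video track

struct AVSequence {                            // e.g. "Sequence 1" in the timeline
    std::vector<AVTrack> VideoTracks;
    std::vector<AVTrack> AudioTracks;
};

struct AVTimeline { std::vector<AVSequence> Sequences; };

struct VidProject {
    std::vector<AVResource> Resources; // everything the clips can refer to
    AVTimeline Timeline;
};

// And the clipboard is just another sequence:
typedef AVSequence AVClipboard;
```

Note how a clip reaches its file through a plain index into Resources: sparse data, no pointers.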

Now, I don't really know how Cinelerra or other video editors work. But one thing's certain: It's much easier to model the classes after the structure of the data we're going to modify. And this data is a timeline.

I've already implemented undo/redo saving for these classes, and you can impose a memory limit on how many undo/redo states are stored in memory. So there goes the high memory requirement. Poof! :)
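The memory-limited history could work something like this hypothetical sketch (the class name and the serialized-snapshot approach are my own illustration, made easy by the pointer-free structures described earlier):

```cpp
#include <cstddef>
#include <deque>
#include <string>

// Undo states are serialized snapshots of the project data. When the total
// size exceeds the configured limit, the oldest states are forgotten first.
class UndoHistory {
public:
    explicit UndoHistory(std::size_t maxbytes) : m_MaxBytes(maxbytes), m_Used(0) {}

    void PushState(const std::string& snapshot) {
        m_States.push_back(snapshot);
        m_Used += snapshot.size();
        while (m_Used > m_MaxBytes && m_States.size() > 1) {
            m_Used -= m_States.front().size();
            m_States.pop_front(); // drop the oldest undo step
        }
    }

    bool CanUndo() const { return m_States.size() > 1; }

    // Discards the current state and returns the previous one.
    std::string Undo() {
        m_Used -= m_States.back().size();
        m_States.pop_back();
        return m_States.back();
    }

    std::size_t StoredStates() const { return m_States.size(); }

private:
    std::deque<std::string> m_States;
    std::size_t m_MaxBytes, m_Used;
};
```

With a byte budget instead of a fixed count, long sessions simply forget their oldest steps instead of eating all the RAM.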

I'll upload this diagram to the webpage when I can.

Tuesday, June 17, 2008

Bugs fixed, we're back in the business!

The annoying configuration bugs that had plagued me for the last couple of days were finally solved (note to self: remember to wrap vsnprintf calls with va_start and va_end next time... how embarrassing ^^; Oh well, I guess these things happen when you're programming, underslept, at 4 AM). Another hint: whenever you're dealing with char* strings, DO NOT store them on the stack! If anything goes wrong, you won't be able to debug because the stack will be messed up. Use varname = new vartype[num] instead.
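For reference, here's how a correct printf-style wrapper looks with the va_start/va_end pairs in place. This is a generic sketch of the idea, not Saya's actual syString code:

```cpp
#include <cstdarg>
#include <cstdio>
#include <string>

// Minimal printf-style formatter. The crucial part is that EVERY vsnprintf
// call gets its own va_start/va_end pair; forgetting them (or reusing a
// va_list after va_end) is exactly the kind of 4 AM bug described above.
std::string Format(const char* fmt, ...) {
    char buf[512];
    va_list args;
    va_start(args, fmt);                // must come before each vsnprintf...
    int n = vsnprintf(buf, sizeof(buf), fmt, args);
    va_end(args);                       // ...and va_end right after it
    if (n < 0) return std::string();    // encoding error
    if (static_cast<std::size_t>(n) < sizeof(buf)) return std::string(buf, n);
    // The result was truncated: retry with a buffer of the exact size,
    // using a FRESH va_start (a va_list can't be reused after va_end).
    std::string result(static_cast<std::size_t>(n) + 1, '\0');
    va_start(args, fmt);
    vsnprintf(&result[0], n + 1, fmt, args);
    va_end(args);
    result.resize(n);
    return result;
}
```

Note also that the big buffer lives on the heap (inside std::string), not on the stack.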

AAaaaaaaanyway...

I added a nifty debug log to our wxApp object, so we can finally know what's happening behind the scenes. This is excellent for hard-to-debug cases. And now, back to business, there are tons of things to do:

  • Fix the updating of the recent-projects menu. I currently use a flag; it's better to use a counter to tell whether it's been updated.
  • Implement the audio/video presets in the new project menu
  • Implement project creation and saving with XML.
  • Keep working on the framework to do some video streaming (Jonathan's beaten me to that already, I must keep up in the race!)
  • Update the website, write the class diagrams, etc.
And now, for some well-deserved rest. G'night!

A new dev comes, an old dev leaves... possibly.

Everybody welcome CJ-Bark, our newest team member.
Also, one of the developers might... "take a long vacation", since I haven't received a single e-mail from him since he joined.

Keep in touch.

Monday, June 16, 2008

Conversion finished! Now onto bug fixing.

The good: The backend conversion from wxWidgets to generic functions has been completed! :)

The bad: It's buggy. Now the Open dialog doesn't show the recently opened projects. :-/

The ugly: I'll have to debug! :(

Sunday, June 15, 2008

Saying goodbye to wxWidgets (partially)

Turns out a lot more wxWidgets functions were used in the backend than I expected.

After a few hours of rewriting code, I managed to replace almost all of them. However, I'm left with a few stubs (I only committed a zipfile with the changes; I don't want to break the build in SVN). Here's a list of the new functions and changes:

  • static std::string ioCommon::GetPathname(std::string fullpath); // UNFINISHED
  • static std::string ioCommon::GetFilename(std::string fullpath); // UNFINISHED
  • static bool ioCommon::FileExists(std::string filename);
  • static bool ioCommon::FileExists(const char* filename);
  • class BufferedFile; // UNFINISHED
  • class TempFile; // UNFINISHED
  • const std::string syString::Format(const char* format, ... );
  • const std::string syString::FormatBig(unsigned long bufsize, const char* format, ... );
  • enum sayaEventType;
  • enum sayaYesNoCancel;
  • class sayaEvtHandler; // This will be used for handling the events coming from the backend.
  • class sayaConfig; // A wxConfig wrapper, it's an abstract class.
  • class sayaConfigProvider; // abstract class. UNFINISHED; needs an implementation (wrapping) done in wxWidgets.
  • const wxString std2wx(const std::string& str);
  • const wxString std2wx(const char* str);
  • ALL wxString references in ProjectManager were replaced by std::string
  • Since wxWidgets uses the _() macro, all literal strings will have to be surrounded by gettext().
For now it seems we have to implement the stubs I created. But after that, the code architecture will be much cleaner and more robust, because ProjectManager and related classes now depend only on the STL and some basic libraries like stdio. This means that if someone wants to make a frontend for the project in Qt, it will be much easier to do now.
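For illustration, here's one way the GetPathname / GetFilename stubs could be implemented. This is a sketch only (shown as a namespace rather than the actual ioCommon class), splitting on the last path separator and accepting both '/' and '\' so Windows and POSIX paths work alike:

```cpp
#include <string>

namespace ioCommon {

// Everything before the last path separator (empty if there is none).
std::string GetPathname(const std::string& fullpath) {
    std::string::size_type pos = fullpath.find_last_of("/\\");
    return (pos == std::string::npos) ? std::string() : fullpath.substr(0, pos);
}

// Everything after the last path separator (the whole string if none).
std::string GetFilename(const std::string& fullpath) {
    std::string::size_type pos = fullpath.find_last_of("/\\");
    return (pos == std::string::npos) ? fullpath : fullpath.substr(pos + 1);
}

} // namespace ioCommon
```

Nothing fancy, but it keeps the backend free of wxFileName.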

Now I have to go to sleep. I'm dead.

Saturday, June 14, 2008

Saying goodbye to wxString

I just realized something. If I want to make Saya UI-toolkit independent (except for the UI frontend), I'll have to add another bunch of wrappers. But it's worth it.

So I'll start replacing the wxString references with std::string everywhere except in Main.h and Main.cpp. This will also help me get rid of the _T() macro. For internationalization, I guess I'll redefine the _() macro to some internal function used only by the backend.

It also means I'll make an intermediate class to pass the messages (like presenting dialogs). I think I'm going to like this.

Friday, June 13, 2008

SVN problems and headaches

We're having technical problems with the SVN server at BerliOS. It seems I'll have to change the access method from svn+ssh to https. But I'm afraid I can't do it right now; I have a horrible headache. I'll keep you updated.

Update: Fixed. To solve the SVN access issue, I only had to replace the svn+ssh login method with https. I had to do some directory copying, but it was easy.

Thursday, June 12, 2008

Critical Sections and Mutexes

Ah, the joy of multithreading. I really miss the wxMutex and wxMutexLocker classes of wxWidgets, which, unfortunately, I can't use for the wrappers (because they're wrappers and therefore need to be as lightweight as possible; linking to the wxWidgets library isn't exactly what I want).

The first decision I took was to use SFML's Mutex class. Unfortunately, I stumbled upon a problem: if I include it in the wrapper, what happens when one of the plugins actually uses the library itself? Would I get duplicate definitions? Should I copy the code and rename it?

Forget it. Enter Google. I found two great resources on mutexes: computing.llnl.gov (which includes a copy of the pthreads manpage) and Wikipedia. Lucky me, the Wikipedia tutorial included a cross-platform implementation of critical sections! And it's GFDL, alright!
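For the curious, a wrapper in the spirit of that cross-platform critical-section code looks roughly like this. The class names are made up to mimic wxMutex/wxMutexLocker; this is a sketch, not necessarily the code that went into Saya:

```cpp
// Win32 CRITICAL_SECTIONs on Windows, pthread mutexes everywhere else,
// hidden behind one tiny class with no external library dependencies.
#ifdef _WIN32
  #include <windows.h>
#else
  #include <pthread.h>
#endif

class syMutex {
public:
    syMutex() {
#ifdef _WIN32
        InitializeCriticalSection(&m_Mutex);
#else
        pthread_mutex_init(&m_Mutex, NULL);
#endif
    }
    ~syMutex() {
#ifdef _WIN32
        DeleteCriticalSection(&m_Mutex);
#else
        pthread_mutex_destroy(&m_Mutex);
#endif
    }
    void Lock() {
#ifdef _WIN32
        EnterCriticalSection(&m_Mutex);
#else
        pthread_mutex_lock(&m_Mutex);
#endif
    }
    void Unlock() {
#ifdef _WIN32
        LeaveCriticalSection(&m_Mutex);
#else
        pthread_mutex_unlock(&m_Mutex);
#endif
    }
private:
#ifdef _WIN32
    CRITICAL_SECTION m_Mutex;
#else
    pthread_mutex_t m_Mutex;
#endif
};

// RAII locker in the style of wxMutexLocker: unlocks on scope exit,
// even if an exception is thrown in between.
class syMutexLocker {
public:
    explicit syMutexLocker(syMutex& m) : m_Mutex(m) { m_Mutex.Lock(); }
    ~syMutexLocker() { m_Mutex.Unlock(); }
private:
    syMutex& m_Mutex;
};
```

The locker class is the part I'd actually use everywhere; manual Lock/Unlock pairs are where deadlocks are born.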

Now let's see if I can find a good example of a cross-platform thread-safe sleep() function.

Update: Done. Now I can go back to work on thread-safety for the audio/video wrappers.

Wednesday, June 11, 2008

OpenVIP: The ace up my sleeve

There's been more than one person who doubts that I can write a video editor alone. The complexities of writing a good video editor backend are too great for a "newbie" to tackle on his own.

And they're right. But I'm not writing a backend. I already have one. See?
http://openvip.sourceforge.net/

From the webpage:

OpenVIP is a free video-processing tool for Linux and Windows. It consists of two parts:
  • OpenVIP core, which can be used for processing multimedia files from command line, or as a C++ library linked to other applications.
  • OpenVIP editor, which provides a user-friendly GUI to the core and is based on the timeline concept - you place multimedia files on the timeline, apply filters, transitions, ...
These are the main features of OpenVIP:
  • Supports AVI, DV, MPEG, MOV, MP3, WMA, and WMV formats (via the FFmpeg libraries) as well as sequences of bitmap files (via the ImageMagick library)
  • A lot of nice plugins including colour transformations, geometric distortions, basic sound processing and transitions between two clips
  • A simple interface for developing your own plugins in C++
OpenVIP was designed by a college group as their graduation project. Unfortunately, they couldn't continue it. That's where I come in. Or should I say we, since there's already a development team for Saya.

All we're doing is designing a professional user interface around the OpenVIP Video Editor framework. Because the framework is done in C++, designing the frontend is much simpler than starting a video editor from scratch.

The only problem is that OpenVIP is released as GPLv2, while Saya is GPLv3. I've asked the main developer to release the project as GPLv3 so I can add the framework right away. Perhaps his mail got lost, I guess I'll ask again. (Edit: I just got a mail from Antonin Slavik, the OpenVIP project leader. OpenVIP is now GPLv3 or later with linking permission! Road's clear!)

Still... we can't make the same mistake other developers have made by tying themselves to one framework and later realizing it doesn't do what they need. This is why we're building a whole "interface layer" in Saya, so the video processing and playback frameworks can be inserted as plugins. Some things, like threads and mutexes, will need to be implemented on our own (or outright stolen from mature frameworks). There's a 90% chance that I'll borrow SFML's system module to deal with this part.

The rest will be a lot easier to handle.

Tuesday, June 10, 2008

Status report. June 10, 2008

What has been done:

* Organizing the team and recruiting members. Some members I haven't been able to contact live, but we got an Adobe Premiere video-editing expert/advisor (edit: and tentatively another advisor, who happens to be a Sony Vegas expert and User Interface Nazi ;-) ), a beta tester, 3 developers and possibly 1 more developer (if I manage to get him online). Additionally, an old friend from school, a college graduate, is offering to join the project... but not before he passes a wxWidgets exam I assigned him :)

* Research in various fields. Nopalin has written his first wxWidgets programs and is ready to work on the UI. b3rx is documenting the OpenVIP classes and some classes that have already been designed. I've been researching the use of SFML and have corrected various misunderstandings in the wrapper classes.

* The pluggable multimedia framework has been started. The abstract class AudioOutputDevice has been completed. The abstract class VideoOutputDevice has been started (as a stub). Actual implementations, using classes derived from these, remain to be written.

What remains to be done:

- Completing the New Project Dialog and creating new projects.
- Creating an A/V streaming class for rendering video and audio on-the-fly.
- Creating a plain vanilla video player using the said classes.

If everything goes well, this will be done within the next 4 weeks.

To SDL or to SFML? That is the question

I recently found a multimedia library called SFML (Simple and Fast Multimedia Library). According to the webpage, "Instead of being one big API, SFML rather contains a lot of small packages, that can be chosen and combined according to the intended usage. You can use only the base package to get input and windowing, as well as the full graphics package with sprites and post-effects." Additionally, it's written in C++ and seems to be very flexible.

That sounds just like what we're looking for. However, there is a problem: the project is relatively young compared to SDL. How stable and reliable is it?

I'm sure choosing a multimedia library is a problem many multimedia programmers face: they have to base their entire codebase on one library or another. This is why I decided to design abstract classes as wrappers, so we don't have to face that decision. I hope that with SFML we can have a working multimedia player soon.
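To make the idea concrete, here's a hypothetical sketch of such an abstract wrapper (the real VideoOutputDevice mentioned in the status report above surely differs): the editor codes against this interface, and an SDL- or SFML-backed subclass implements the pure virtual methods. Swapping multimedia libraries then means swapping one class.

```cpp
#include <cstddef>

class VideoOutputDevice {
public:
    VideoOutputDevice() : m_Width(0), m_Height(0) {}
    virtual ~VideoOutputDevice() {}

    // Public API the editor uses; never touches the backend directly.
    bool Init(unsigned int width, unsigned int height) {
        m_Width = width;
        m_Height = height;
        return InitializeOutput(); // backend-specific setup
    }

    void RenderFrame(const unsigned char* rgb32, std::size_t len) {
        if (m_Width && m_Height) RenderData(rgb32, len);
    }

    unsigned int GetWidth() const { return m_Width; }
    unsigned int GetHeight() const { return m_Height; }

protected:
    // Only these two methods know about SDL, SFML, or whatever we pick.
    virtual bool InitializeOutput() = 0;
    virtual void RenderData(const unsigned char* data, std::size_t len) = 0;

    unsigned int m_Width, m_Height;
};
```

A plugin just derives from this and fills in the two protected methods; the rest of Saya never needs to know which library is underneath.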

Monday, June 9, 2008

We got a team! And a website!

This week has been hectic and I haven't been able to sleep very well - but it was worth it. Saya-VE has a website, a Sourceforge page, 3 more developers and a beta tester.

We'll keep you posted.

Thursday, June 5, 2008

on DIVA, GStreamer and Tiers gone wrong

I just found this awesome post from Michael Dominic explaining his problems with the GStreamer framework.

"GStreamer solves a lot of problems on the GNOME desktop but it doesn’t solve the problem of video editing.

Gst is a playback framework, and for video editing you need editing framework. The later is not, as it’s commonly believed, just a superset of the former. The MLT framework is an interesting example of the “video editing” architecture."

A later comment from a reader explains further:

"I would also join the voices in suggesting that you reconsider your decision with GStreamer. In the last six months GStreamer has become more and more suitable for applications such as Diva, and Gnonlin is a particularly useful component here."

And here I was thinking that I could use GStreamer as a basis for Saya. I think I can explain the concepts with a small diagram:

PiTiVi or some other editor
Gnonlin
Gstreamer
Video Hardware

That is, Gnonlin is a PLUGIN which goes with GStreamer (if it's not, someone correct me please!). The problem is that this plugin is a video-editing component built upon a low-level playback framework. That is:

Editor UI
Editing stuff
Playback
OS / Hardware

and that's a no-no. It's got the logic ALL WRONG!

Instead, what I want is something like this (the higher, the more user interaction) :

GUI
^
| (commands, events)
v
High-level Editor framework (track / timeline /
effect stuff handling)
^
| (playback info, commands, events)

v
Renderer (effects / mixing)
| |
| v
| Low level
| Playback Framework
| (i.e. SDL)
| |
| v
v Video Hardware

Decoder
|
|
v
Files

In other words, the Decoder/Files make up the MODEL. The Playback/Video Hardware is the PRESENTATION (the GUI is another, separate layer of presentation, so they're tied together, in a way), and the Editor framework is the CONTROLLER.

Ladies and gentlemen, this is nothing but Model-View-Controller 101.

Any questions?

---------
Update (June 23, 2008) : It seems I was mistaken in my assumptions regarding GStreamer and PiTiVi. However, the MVC pattern is still a rule I'm going to follow. Stay tuned.

Wednesday, June 4, 2008

There can be only one... (not)

At least I'm not alone. Another guy had more or less the same idea that I had, and around the same time. Clearly this is an indicator that video editing in Linux *DOES* have a problem. His project blog is at http://myvideoeditor.blogspot.com/

Alas, it seems he chose another path (he's going to use GStreamer and Python). Why, oh why??? I really don't know if he'll succeed, but I asked him to join forces. Sigh, if only we had managed to get in touch earlier. Jonathan claims that C++ is a monster that's hard to debug. Well, not if you use the latest GDB and Code::Blocks. Programming C++ with Code::Blocks is a breeze ;-)

Luke er Jonathan... JOIN ME, and we shall rule the galaxy together! Well, more or less :P

Saturday, May 31, 2008

"New Project" dialog

After researching for a couple of days and designing for a whole day, I finally have a decent and pretty "new project" dialog. So far it's been one of the most complicated things I've done for the project, because I needed to gather info about all the well-known editors and standard video formats (thank you, Wikipedia!).

Premiere shows more settings, while Edius Pro's dialog is more compact and "friendly", but in my opinion oversimplified. So I had to find a compromise.

One of the things that annoyed me about Premiere was that you had to scroll and read the settings as you kept choosing, and if you didn't like a preset, you were presented with an overcomplicated screen of settings.

I managed to include all the important settings in one page, and they change as I switch between presets. If I choose "custom", the settings become read-write (otherwise they're read-only and greyed out), so I can just type everything right away.

I also managed to make them pretty and not bloated. This took me a while, but putting the video settings at the left and the audio settings at the right, just did the trick.



I would also like to thank the wxFormBuilder developers for making such a great tool.