There are a few SO questions related to frame-by-frame animation (e.g. frame by frame animation and other similar questions), however I feel mine is different, so here goes.
This is partly a design question, coming from someone with little iOS experience.
I'm not sure "frame by frame" is the correct description of what I want to do, so let me describe it. Basically, I have a "script" of an animated movie, and I would like to play this script.
This script is a JSON file that describes a set of scenes. Each scene has several elements, such as a background image, a list of actors with their positions, and a background sound clip. In addition, for each actor and background there is an image file that represents it. (It's a bit more complicated than that - each actor also has a "behavior", for example how it blinks, how it talks, etc.) So my job is to follow the script: for each frame, place the actors at their assigned positions, draw the correct background, and play the right sound clip.
A movie can be paused and scrubbed forward or backward, similar to the YouTube player controls.
Most of the questions I've seen about frame animation have different requirements than mine (I will get to some additional requirements later). They usually suggest using the animationImages property of UIImageView. That is great for animating a button or a checkbox, but they all assume a short, predefined set of images to play.
If I went with animationImages, I would have to pre-create all the images up front, and my gut feeling is that it will not scale (think 30 fps for one minute: that's 60 * 30 = 1800 images; scrubbing and pause/play also seem complicated in that case).
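(For reference, the usage pattern I mean is roughly the following; the frame file names and counts here are just placeholders for my case.)

```swift
import UIKit

// The animationImages pattern I'm referring to: preload every frame as a UIImage
// and let UIImageView cycle through them. The frame file names are hypothetical.
let frameView = UIImageView(frame: CGRect(x: 0, y: 0, width: 320, height: 480))
frameView.animationImages = (0..<1800).compactMap { UIImage(named: String(format: "frame_%04d", $0)) }
frameView.animationDuration = 60.0   // 1800 frames at 30 fps
frameView.animationRepeatCount = 1
frameView.startAnimating()           // note: no built-in pause or scrub here
```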
So I'm looking for the right way to do this. My instinct (and I'm learning more as I go) is that there are probably three or four main ways to do it:
- Use Core Animation, defining "key points" and animating the transitions between them. For example, if an actor must be at point A at time t1 and at point B at time t2, then all I need to do is tween whatever is in between. I did something similar in ActionScript in the past and it worked, but implementing scrubbing and keeping everything in sync was particularly hard, so I'm not a big fan of this approach. Imagine having to pause in the middle of an animation, or scrub to its middle. It's doable, but not nice. (A rough sketch of what I mean follows this list.)
- Set a timer firing, say, 30 times per second, and on each tick consult the model (the model being the JSON script file plus the descriptions of the actors and backgrounds) and draw whatever needs to be drawn at that time, using the Quartz 2D API and drawRect:. This is probably the simplest approach, but I don't have enough experience to tell how well it would perform on different devices; it's probably CPU-bound, and it all depends on how much computation I need to do on each tick and how much work iOS has to do to draw it all. I have no clue. (Also sketched after this list.)
- Similar to 2, but use OpenGL for the drawing. I'd prefer 2 because the API is easier, but maybe OpenGL is more suitable resource-wise.
- Use a game framework like cocos2d, which I have never used before but which seems to solve more or less similar problems. It seems to have a nice API, so I would be happy to find that it covers all my requirements.
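For option 1, this is roughly what I have in mind, just to make the approach concrete (a minimal sketch; the layer contents, points and timings are invented, not part of my actual script):

```swift
import UIKit

// Option 1 sketch: let Core Animation tween an actor's position between key points.
// The layer contents, points and timings stand in for what the JSON script would define.
let actorLayer = CALayer()
actorLayer.contents = UIImage(named: "actor")?.cgImage

let move = CAKeyframeAnimation(keyPath: "position")
move.values = [CGPoint(x: 50, y: 100), CGPoint(x: 200, y: 150), CGPoint(x: 120, y: 300)]
    .map { NSValue(cgPoint: $0) }
move.keyTimes = [0.0, 0.4, 1.0]      // fractions of the total duration
move.duration = 5.0                  // seconds from the first key point to the last
actorLayer.add(move, forKey: "scriptedMove")

// Pausing/scrubbing would mean manipulating the layer's timing, e.g.
// actorLayer.speed = 0 and actorLayer.timeOffset = <target time>,
// which is exactly the bookkeeping I found awkward to keep in sync.
```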
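And for option 2, the kind of redraw loop I imagine (CADisplayLink rather than NSTimer, since it ticks with the display; `MovieScript` and its `draw(at:in:)` are made-up stand-ins for my JSON model):

```swift
import UIKit

// Hypothetical model interface: given a playback time, draw that frame into a context.
protocol MovieScript {
    func draw(at time: TimeInterval, in ctx: CGContext)
}

// Option 2 sketch: a view that redraws itself on every tick from the current playback time.
final class MovieView: UIView {
    var script: MovieScript?
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    func play() {
        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        setNeedsDisplay()                    // request a redraw; draw(_:) does the work
    }

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext(), let script = script else { return }
        let elapsed = CACurrentMediaTime() - startTime
        script.draw(at: elapsed, in: ctx)    // background first, then each actor at its position
    }
}
```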
On top of the requirements I just described (play the movie according to its "script" and the descriptions of the actors, backgrounds and sounds), there is another set of requirements:
- The movie must be played in full screen mode or in partial screen mode (where the rest of the screen is for other controls).
- I'm starting with the iPhone; naturally the iPad should follow.
- I would like to create a thumbnail of the movie for local use on the phone (to display in the gallery of my application). The thumbnail can simply be the first frame of the movie. (A rough idea of how I'd grab it follows this list.)
- I want to be able to "export" the result as a movie file that can easily be uploaded to YouTube or Facebook.
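For the thumbnail requirement, I imagine something along these lines, assuming whatever view renders the movie is currently showing the first frame (the function and its parameters are just my guess at an approach):

```swift
import UIKit

// Thumbnail sketch: rasterize the view that is currently showing frame 0 into a UIImage.
// `movieView` is assumed to already display the movie's first frame.
func thumbnail(of movieView: UIView, size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { ctx in
        // Scale the view's layer down to the thumbnail size, then render it.
        ctx.cgContext.scaleBy(x: size.width / movieView.bounds.width,
                              y: size.height / movieView.bounds.height)
        movieView.layer.render(in: ctx.cgContext)
    }
}
```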
So the big question is whether any of the proposed implementations 1-4 (or others you might suggest) can somehow export such a movie.
If none of the four can handle exporting the movie, I have an alternative: use a server that runs ffmpeg and accepts a bundle of all the movie's frame images (I would have to render them on the phone and upload them in their proper sequence), and the server would then compile all the frames, together with the soundtrack, into one movie.
Obviously, to keep things simple, I would prefer to do this without a server, i.e. be able to export the movie from the iPhone itself, but if that's too much to ask, then the fallback requirement would be at least the ability to export the set of all frame images, so that I can bundle them and upload them to the server.
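In case it helps frame the question, this is the rough on-device export pipeline I am imagining, based only on my reading of the AVAssetWriter docs (the output settings, the pixel-buffer handling and the frame source are all assumptions on my part, and audio is left out). I'd like to know whether something like this is realistic with any of 1-4:

```swift
import AVFoundation
import UIKit

// Export sketch: feed pre-rendered frames (CGImages) to AVAssetWriter at 30 fps.
// The frame source, output settings and error handling are simplified assumptions.
func exportMovie(frames: [CGImage], size: CGSize, to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    guard writer.startWriting() else { return }
    writer.startSession(atSourceTime: .zero)

    for (index, frame) in frames.enumerated() {
        while !input.isReadyForMoreMediaData { /* naive wait; a real queue would be better */ }

        // Wrap the rendered frame in a CVPixelBuffer so the adaptor can append it.
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault, frame.width, frame.height,
                            kCVPixelFormatType_32ARGB, nil, &pixelBuffer)
        guard let buffer = pixelBuffer else { continue }
        CVPixelBufferLockBaseAddress(buffer, [])
        if let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                               width: frame.width, height: frame.height,
                               bitsPerComponent: 8,
                               bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                               space: CGColorSpaceCreateDeviceRGB(),
                               bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            ctx.draw(frame, in: CGRect(x: 0, y: 0, width: frame.width, height: frame.height))
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        _ = adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(index), timescale: 30))
    }
    input.markAsFinished()
    writer.finishWriting { }
}
```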
The duration of the movie would be one or two minutes. I hope this question is not too long and that it's clear...
Thanks!