For my portfolio, I chose to create a showreel. I recorded the work I thought would be relevant by screen-recording it in Blender, then used Premiere Pro to stitch it all together. I included everything from 3D modelling to game development and motion capture. I think the selection was overall satisfactory, but I would like to have even more work there in the future; I guess that is the natural progression of things. Overall, I am very happy with the showreel I created, and I think it is a good reflection of my work and of me as an individual.
Secondly, I created a website through Cargo and posted my work there; this will be the platform I use to share my portfolio. I also included links to my Instagram, YouTube channel and LinkedIn. My CV is there as well, along with older projects that, despite not being the main focus of the website, I thought were still relevant for showing my diverse set of skills. However, I didn't include everything; I have done a lot of graphic design work that I would love to post as well.
Aesthetics-wise, I love the way the website looks. It plays on the idea of the website being a computer desktop, with the work accessible through icons that mimic those of an operating system. I thought this was not only unique but also a smart way to show how my work relates to technology and how much I have embraced digital mediums.
We were very happy with the final result! There are things that could be improved in terms of quality and overall polish, but it ended up looking very crisp, and the switch from Blender to Premiere Pro had little to no impact on the 3D effect of the characters in space. This was definitely our biggest concern, yet it actually became our saviour; otherwise, producing a rendered video would have been impossible. I also really appreciated that we included a 2D smartphone experience for the viewers during our presentation. Even though it wasn't viewed through VR headsets, we felt everyone was equally immersed, and it without a doubt added a one-to-one layer that was more personal and interactive.
Final Corrected Version
After talking with Manos, I made a few changes to the video, mainly to the speed of the air character and the position of its video, so that it wouldn't leave such a big gap between itself and the fire character. For the particle character, I tried to change its colour in After Effects, but it wasn't working properly because the character was too dark to be recognised as a separate mesh. I did follow a couple of tutorials, but in the end I couldn't do much to change it.
Regarding the remaining characters, we didn't change anything. The rendering would take too long, and both Ana and I knew this wouldn't be a realistic or necessary approach.
During our first group sessions, we went to LCDS, drafted ideas and started recording videos. We then experimented a bit with the Rokoko AI tool and created the very first clips, which we used as a reference for this project.
AI Rokoko Software and Limitations
We noticed right away that Rokoko Studio has some clear limitations. The animation looks very glitchy and the movement doesn’t look very natural or fluid. However, our plan is to use the mocap suits to get the best possible results.
Communication and Role Distribution
Our chosen method of communication was Instagram. We created a group chat and added everyone. This way, we were able to share our ideas, progress and insights during breaks and whenever we didn't have sessions together but still needed each other's opinions.
As for role distribution, we assigned a character/element to each group member. The dance students will also be responsible for creating their own choreographies, and we will leave all interpretation of the movement to them; our role will mostly be interpretation through aesthetics.
Between Ana and me, I will do water, earth and the post-processing, while she does air, fire and the human. Each of us will be responsible for rendering our own characters, and once every model is complete, I will bring them all into the same file and render a 360° video.
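To make that last step more concrete, this is a minimal sketch of how the characters could be appended into one file and the camera switched to a 360° panorama using Blender's Python API. The file paths and collection names are placeholders, not our actual file structure.

```python
import bpy

# Append each finished character into the current file
# (file paths and collection names are hypothetical placeholders)
character_files = [
    ("water.blend", "Water"),
    ("earth.blend", "Earth"),
]
for blend_path, collection_name in character_files:
    bpy.ops.wm.append(
        filepath=f"{blend_path}/Collection/{collection_name}",
        directory=f"{blend_path}/Collection/",
        filename=collection_name,
    )

# Switch the scene camera to an equirectangular panorama for a 360° render (Cycles)
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'  # newer Blender versions expose this as cam.panorama_type
```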
The Characters and their Movement
For each character in our project, we wanted to depict a human interpretation of how each element would move. To achieve this, we let each dancer decide which element they would be, based on personality. For example, Vee, the most extroverted and energetic person in the group, became fire, which totally fits her overall behaviour.
Afterwards, we let them explore the movement for each element, as long as it made sense and was coherent with the element's physical properties. Water should flow smoothly. Air should look free and light. Earth is solid and robust. Fire is wild and unpredictable. The human character is up to the dancer, since it can be a representation of themselves as a person and, by extension, of a human being.
Software and Tools
As I've previously mentioned, we are using Rokoko Studio to create some rough character animations. After that, we will export those animations to Blender, retarget them onto Mixamo characters, add all the effects and render everything. Finally, we will use Premiere Pro to add the sound effects and any other post-processing elements we deem necessary.
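On the Blender side, bringing the exported takes and a Mixamo character into a scene can be as simple as a couple of FBX imports; the file names below are hypothetical, and the retargeting itself happens in the Rokoko plug-in's panel rather than in a script.

```python
import bpy

# Hypothetical file names: a mocap take exported from Rokoko Studio
# and a character downloaded from Mixamo, both as FBX
bpy.ops.import_scene.fbx(filepath="rokoko_take_fire.fbx")
bpy.ops.import_scene.fbx(filepath="mixamo_character.fbx")

# The retargeting from the mocap armature onto the Mixamo rig is then done
# through the Rokoko plug-in's Retargeting panel, not through this script.
```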
Rokoko plug-in for Blender – Video Tutorial
Update
Another change of plans. I explain it in more detail below, but we will also be using the mocap suits since they are finally working! The movement looks so much smoother, and we can finally produce something that looks more professional.
Experiments
I started experimenting with my characters a bit. I followed some YouTube tutorials and adapted them to what I thought looked better. For the Water element, the baking took half an hour and the rendering for a 15-second video took almost two hours… I will not be able to do this many times, so I have to figure out a way to cut down on the rendering time. Ana also had similar problems, unfortunately.
We have been discussing this and think it would be better to render everything at LCC and split the work among 10 desktops. This way the rendering should be much faster, and hopefully the desktops are more powerful than our laptops.
For the Earth character, the experimentation process was much smoother. For the body, I added some rock models I got from Sketchfab. As for the grass, I used the hair particle system, which created some really interesting-looking renders. I used another YouTube tutorial to help with this process, which I will link below.
I'd also like to mention that, after having watched and experimented with all these methods and tutorials, I realised I can attach pretty much anything I want to a mesh. The particle systems are very intuitive and not as complicated as they look, as the sketch below shows. I look forward to experimenting a bit more with these tools in future projects.
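For reference, this is roughly what the grassy hair-particle setup boils down to in Blender's Python API; the object name and the exact counts and lengths are placeholders rather than my real values.

```python
import bpy

# "Body" stands in for the character's mesh object
body = bpy.data.objects["Body"]

# Add a particle system modifier and switch it to hair mode
mod = body.modifiers.new(name="Grass", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings

settings.type = 'HAIR'
settings.count = 2000                  # number of strands
settings.hair_length = 0.15            # strand length in metres
settings.child_type = 'INTERPOLATED'   # children thicken the coverage cheaply
```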
YouTube tutorial 1
YouTube tutorial 2
Green Room Recording
Thank God the Green Room was open and the mocap suits were finally working! We were able to record all the choreographies with the suits, and the movements look so much better! It was also super fun to see the LCDS students enjoying themselves and experimenting with the mocap suits!
Final Models
Modelling the characters again was very easy, and after all those experiments, everything felt intuitive and natural. However, I made some major changes to the Earth character. The previous model looked a bit weird and I wasn't very happy with the result; it looked a bit silly… Therefore, I remade everything and, instead of using the hair particle system for the grass, I used a 3D grass model from Sketchfab and scattered it as particles throughout the character's body. I also added a couple of flowers to make it a bit cuter and really get that feeling of nature emanating from him. Finally, I decided to use Mixamo's character mesh and imported a rock/dirt-like material that worked very well aesthetically and really brought it all together.
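Scattering a downloaded mesh over a body with particles looks roughly like this in Blender's Python API; the object names and counts are placeholders, and very old Blender versions name some of these properties differently.

```python
import bpy

# Placeholder names for the Earth character's mesh and the Sketchfab grass clump
body = bpy.data.objects["EarthBody"]
grass = bpy.data.objects["GrassClump"]

mod = body.modifiers.new(name="GrassSpread", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings

settings.type = 'HAIR'             # hair mode keeps the instances stuck to the surface
settings.count = 800
settings.render_type = 'OBJECT'    # render the grass mesh instead of strands
settings.instance_object = grass   # ("dupli_object" in very old Blender versions)
settings.particle_size = 0.05
settings.size_random = 0.5
settings.use_rotations = True
settings.rotation_mode = 'NOR'     # align each clump to the surface normal
```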
The Water element pretty much remains the same. I changed the colour slightly and finally found a way to parent the liquid mesh to the character so that it follows the character as they dance without changing height, meaning it only moves on the X and Y axes (side to side).
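One way to get that "follow side to side but keep the height" behaviour is a Copy Location constraint with the Z axis disabled; this is a sketch of that approach with placeholder object names, not necessarily exactly how my file is set up.

```python
import bpy

# Placeholder names for the liquid mesh and the dancing character
water = bpy.data.objects["WaterMesh"]
character = bpy.data.objects["Character"]

# Copy Location constraint: follow the character on X and Y, ignore Z
con = water.constraints.new(type='COPY_LOCATION')
con.target = character
con.use_x = True
con.use_y = True
con.use_z = False   # the water keeps its own height while still tracking the dancer
```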
Water Model Update
When adding the lights, I noticed I couldn't really see anything because the background was black and the water was transparent. After playing around a bit, I found a really cool solution. I attached two orbs of light to the character's mesh by parenting them (just like I did for the mesh's cube), and it created this really cool effect, as if they had been part of the character all along.
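In script form, the two-orb trick is just a matter of creating point lights and parenting them to the character; the names, offsets and energy values below are illustrative.

```python
import bpy

# "WaterMesh" is a placeholder for the character object the orbs follow
character = bpy.data.objects["WaterMesh"]

for name, offset in [("Orb.L", (-0.3, 0.0, 1.2)), ("Orb.R", (0.3, 0.0, 1.2))]:
    light_data = bpy.data.lights.new(name=name, type='POINT')
    light_data.energy = 200.0                      # watts; tweak to taste
    orb = bpy.data.objects.new(name, light_data)
    bpy.context.collection.objects.link(orb)
    orb.location = offset                          # offset in the character's local space
    orb.parent = character                         # the orbs now travel with the mesh
```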
Rendering Issues
Where to start… We finally finished all the characters' meshes and models, and after hours of work, we realised that we can't render everything in one Blender file. As soon as we opened it, the desktop crashed. We had to toss that idea and find a solution, at least if we wanted to keep the original 360° video idea.
I suggested we render each character individually, in their own respective files, and that I then stitch all the videos together in Premiere Pro by creating a 360° sequence and importing all the choreographies as 2D videos. By turning them into sphere-mapped layers adapted for VR, this would technically be possible. Honestly, we don't have any other options left, so this will have to do. Fingers crossed!
Time/Aesthetics Update
After some discussion, we're going to do 45 seconds of video. This is already a lot, but we think it should be more than enough to showcase our work and still be manageable within the limited timeframe we have. This is equivalent to 1,350 frames at 30 fps.
Secondly, we had to settle on the same settings for all the characters: the same camera angle, background and render settings. This was actually not a problem. Ana created the background, all black with a reflective floor just as we had planned, and I set up the camera angle and format that became the standard for all five videos.
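For reference, the kind of settings block we kept identical across the five files looks roughly like this; the resolution and output path shown here are illustrative rather than our exact values.

```python
import bpy

scene = bpy.context.scene

# 45 seconds at 30 fps = 1,350 frames
scene.render.fps = 30
scene.frame_start = 1
scene.frame_end = 1350

# Shared render setup (illustrative resolution and output path)
scene.render.engine = 'CYCLES'
scene.render.resolution_x = 3840
scene.render.resolution_y = 2160
scene.render.image_settings.file_format = 'PNG'    # one PNG per frame
scene.render.filepath = "//render/character_"      # relative to the .blend file
```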
Water Rendering
This model gave me so many issues. For some reason, once I started rendering, the water would just fall: the character would dance and all of a sudden the water would decide to behave like real water and collapse into a puddle, even though I had baked the simulation multiple times. It was actually very funny, but frustrating nonetheless. It turned out this was because I was trying to render it in the same file as the other characters; we later figured that file was probably corrupted, since it kept producing weird renders and bugs.
Furthermore, the rendering was going to take too long, to the point that LCC would have closed by the time it was done creating all those PNGs. I had to find a solution. After watching countless YouTube tutorials on how to reduce rendering time, I came across one that finally worked! I changed a couple of settings and took notes so Ana could do the same to her files.
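I won't pretend to remember every toggle from that tutorial, but the usual Cycles levers for cutting render time look something like this:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Typical Cycles settings for shorter render times
# (illustrative values, not necessarily the exact ones from the tutorial)
scene.cycles.samples = 128                # fewer samples per pixel
scene.cycles.use_denoising = True         # denoise so low samples still look clean
scene.cycles.max_bounces = 4              # fewer light bounces
scene.render.use_motion_blur = False      # motion blur is expensive
scene.render.use_persistent_data = True   # keep scene data in memory between frames
```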
Afterwards, I tried splitting the file between two desktops to cut down the rendering time. However, this was not possible. Because of the baked mesh and its physics, I couldn't simply render different frame ranges on different machines: the results looked completely different, and in the end I wouldn't have been able to stitch the sequence together. This meant I had to render everything in one go, sadly.
I gave it a go on my laptop and to my surprise, it was so much faster than the uni’s desktops! I am so glad I switched back to my laptop. It literally took 40 minutes to render the water animation. I was pleasantly surprised.
And with that, the first video was complete…
Earth Rendering
Since we had fixed almost all the issues with the rendering process, this character was, once again, very easy to work with. I simply left it to render throughout the night, and by the time I woke up, my second character had finished processing.
The second video was complete!
Human Character Issues and Rendering
Even though this was not my original task, we were still having a lot of problems with rendering time, so Ana sent me the Human file so I could render it on my laptop. However, when I opened the file and started the animation, the character's mesh would not move; even rebaking it didn't work. Then I noticed that the character had originally been modelled in the shared (i.e. corrupted) file, so I had to redo it. Ana was rendering Fire and Air, which was already taking ages, so I took on this fix since I had finished everything else and my laptop was free.
I had to redo the entire particle system, using the same settings Ana had picked. This was actually quite simple, since the particles were just spheres with a randomized flowing path. Afterwards, I rendered it, which took almost six hours…
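Since the setup is that simple, here is roughly what it boils down to in Blender's Python API: an emitter spraying small sphere instances with some built-in drift. The object names and numbers are placeholders, not Ana's exact settings.

```python
import bpy

# Placeholder names for the human character's mesh and the small sphere to instance
body = bpy.data.objects["HumanBody"]
sphere = bpy.data.objects["Sphere"]

mod = body.modifiers.new(name="HumanParticles", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings

settings.type = 'EMITTER'
settings.count = 5000
settings.frame_start = 1
settings.frame_end = 1350
settings.lifetime = 120
settings.render_type = 'OBJECT'
settings.instance_object = sphere
settings.particle_size = 0.02
settings.brownian_factor = 0.5            # the randomized, flowing drift
settings.effector_weights.gravity = 0.0   # let the spheres float instead of fall
```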
But with that, the third video was complete.
Post-Processing – Premiere Pro
The final step was to stitch all the videos together. This was hands down the easiest step in the entire project, and there is honestly not much to it. For the background, I simply rendered a 360° image of our Blender background. Then, I added all the 2D videos and changed them from plane to sphere. Finally, we added the soundtrack, exported everything as a 4K VR video and uploaded it to YouTube.
Sound
For the song, we wanted something more conceptual and abstract that could capture the fluidity as well as the robotic, somewhat technological aspects of the piece. We came across an extremely interesting track that, in my opinion, fits the video perfectly. There is a natural feeling to it, while the beeping in the background adds a mechanical, hospital-machine quality.
Presentation
I think our presentation went quite well. All the group members who were present were able to engage and talk about their ideas and roles throughout the project. I also think the fact that we had a final product to present helped gain everyone's attention. It felt as if we accomplished what we set out to do, and we are really proud of that.
The first step of this project was coming up with an idea that could easily incorporate choreography while having a strong concept. My immediate idea was to create a choreography based on the three states of matter, since it is physically impossible for them to dance by themselves in our physical world. As humans, we are a combination of these three states of matter, not just one, and the same goes for all living beings on the planet. Therefore, I want to use a digital medium (VR) to help create that otherwise impossible vision.
I decided to pair up with Ana for this project, since we have similar concepts and have never worked together before. Our vision revolves around the three states of matter and how they move in space. Since they make up human beings, it is almost like deconstructing what makes us human and, therefore, what makes dance.
It would be interesting to see how dance can interpret the physical states’ properties and interactions. To do this, we will select some students from LCDS and hopefully create something great.
Our Group
Ana – LCC
Margarida – LCC
Lewis – LCDS
Juliette – LCDS
Ava – LCDS
Vee – LCDS
Update
There was a slight change of plans regarding our concept after our first session at LCDS. Since we have five dance students, we had to find a way to make the three states of matter more inclusive of the number of group members we now have. Therefore, we want to combine the idea of the three states of matter with the four elements, plus one extra element that ties them together. We thought this would still include the first idea while also exploring an extremely interesting and inherently human concept, the four elements. We also felt the idea of the three states of matter was very factual and somewhat cold, whereas making them the four elements adds some warmth and familiarity to our concept.
We would like to explore their movement through space as well as possible interactions. As a result, Ana and I started discussing how this could happen and what medium we would use for our final presentation. To better illustrate what we came up with, I drew a storyboard.
For our fifth element, we later decided to go with something that depicted a human being, but at the same time not quite. Since the idea of the four elements is merely a human-created concept, our existence ties all of these elements together. Furthermore, their physical states are all part of our existence as physical beings, so we can also suggest that they somehow represent us or make us whole. Either way, we will include this final character. But instead of human skin or a generic human figure, we want something we can play around with a bit more, special-effects-wise. So this human would be more of a metaphysical representation rather than an actual person.
Idea – Example
Medium
We pretty much came to a consensus right away regarding the medium we'd use for this project. We feel that a VR video will be the best option. Our plan is to have the dancers surrounding the viewer in a circle, with each element performing its choreography at the same time. This will create a very immersive effect and also allow the viewer to get an even closer look at each character's properties and movements throughout the performance.