Some New Monsters in Blender

I added a couple of new monsters to my game: a skeleton guy and a werewolf character.

e02_wire_front  e02_wire_persp  e02_textured  e02_w_e01_walk_anim

I originally thought that modeling and rigging the skeleton would be easier than modeling a skinned mesh, but it turned out to be a lot more difficult. I modeled each bone with the intention of simply parenting each rigid bone model to the corresponding armature bone. I hoped to eventually have the skeleton break apart into its component bones when the hero character strikes it. Unfortunately, at least for this first pass, it was proving too time-consuming to do it this way, and it also means that the underlying code base would require some special-case implementation to support monsters breaking apart. For the time being, all bones are combined into a single mesh and skinned to the armature as a whole. As far as the texture goes, I know it looks pretty horrible as it is. It’s just a temporary test texture, and I’ll be improving it as I get closer to launch.

The werewolf, on the other hand, was very quick to model, especially with a model sheet that exposed a lot of the detail and dimension for me.

e03_mesh_front  e03_mesh_side

e03_mesh_persp_hands  e03_mesh_ortho

e03_mesh_persp_back_hands  e03_mesh_persp_jaw_and_tail  e03_base_tex

You’ll notice that in the first image, the model sheet drawing isn’t exactly symmetrical. I just worked my way around this and did my best to estimate the vertex positioning. The lower jaw is a separate model that is attached to the chest bone. This way, I can animate the werewolf biting the hero character, because the neck bone will be rotated away from the chest bone. I could have added an additional bone to the armature specifically for the lower jaw, but again, this would require special-case code that I didn’t intend to implement. I’m pretty happy with this model: its design, its vertex and face count, and how easy it was to build and animate. It also looks like the most menacing character in the game so far.

So, my game is coming along. I’m running into numerous roadblocks along the way, but that’s game development for you. I’m hoping to wrap up around mid-October, so I’m excited, a little anxious, and just working on the project whenever I can.

Good luck on your own projects, and as always…

Make it fun!

 

  • Blender 2.74
Posted in Art, Dev | Leave a comment

More Game Progress

Here’s a video of some progress on gameplay from a few weeks back. It’s gettin’ there, slowly but surely!

Posted in Art, Design, Dev, Programming | Leave a comment

Signing Into Android Google Play Game Services with Unity

Google Play Game Services

Google Play’s Game Services provide mobile apps with social game elements like leaderboards, achievements, and quests, which can give more value to the players of those apps. Since I’m using Unity for my current project, I picked up an asset called Android Native Plugin (ANP) from Stan’s Assets. Personally, I’m not sure how many other asset solutions do the same thing, but at the time I needed it, this asset was pretty much staring me in the face with its large number of customer reviews at pretty much 5 stars, its reasonable price, and the many testimonials mentioning the great support. So I went with it.

I actually used ANP first on Number Crunchers, for banner ads on the menu screens, and I also had some luck with Game Services login on that app, but decided against fully implementing the solution at that time.

Signing In

While Stan’s Assets and the Google Play online documentation are quite thorough in describing how to set up game services, the natural progression in software development is that shit happens, and not all the answers to your problems are immediately available. There have been some ongoing discussions on the Unity forums regarding connection issues, and here are some potential solutions and suggestions that I read about there to get this working, in no particular order.

  • The first thing suggested to me was to go to Edit > Project Settings > Quality Settings, and to set V Sync Count to either “Every V Blank” or “Don’t Sync”. Apparently some users have run into issues if this is set to “Every Second V Blank”.
  • The documentation for ANP still mentions that you can test on an emulator (not sure if this is still true), but it’s almost always a good idea to test on an actual device. Deploying admittedly takes a long time, but at least the results are far more accurate. The Google Play documentation also mentions using a debug certificate and linking a debug version of the game in the Developer Console, but when I tried this, I couldn’t get it to work, so you’ll have to deploy a signed APK.
  • Another suggestion was to delete the entry for the game in Game Services in the Developer Console, and then re-enter it, with a new keystore and client certificate. It sounds like a bit of work to redo, but it could possibly be the culprit.
  • If, by chance, you’re upgrading from a previous version of the asset, first, back up your project. I was upgrading from 6.3 to 6.8, and I failed to fully read the “Update Best Practices” page, which probably would have saved me some headaches. Instead, I hand-picked all the files and folders to delete from my Unity project Assets folder, and installed a fresh copy of ANP. This resulted in a broken asset similar to what user D0R1N encountered here. I tried wiping the files clean again, and got as far as publishing and deploying my APK file, but I was still unable to connect. It wasn’t until I removed ANP again using the “Remove” button inside the plugin that I got the asset working correctly.
  • Lastly, there have been suggestions (starting from user petediddy) about using a coroutine to wait for the connect() call to finish. In any of your MonoBehaviours, particularly on init,
    void Start()
    {
        StartCoroutine ("StartGooglePlay");
    }
    
    IEnumerator StartGooglePlay()
    {
        if (GooglePlayConnection.state != GPConnectionState.STATE_CONNECTED)
        {
            GooglePlayConnection.instance.connect ();
            yield return new WaitForSeconds (1f); //pause for 1 second
        }
    }

    This isn’t directly tied to disconnection problems, but it has been mentioned for those who have had problems with sign-in latency and delays. I’ve decided to keep this workaround in my code until a more stable release or fix is established (a polling variant is sketched below).
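    Here’s a rough variant of that suggestion. This is my own sketch, not code from the forum thread, and it assumes the same GooglePlayConnection / GPConnectionState API from ANP used above: instead of waiting a fixed second, it polls the connection state every frame until it reports connected or a timeout expires.

    IEnumerator StartGooglePlayWithTimeout(float timeout)
    {
        if (GooglePlayConnection.state != GPConnectionState.STATE_CONNECTED)
        {
            GooglePlayConnection.instance.connect ();
        }

        // Poll once per frame until connected or the timeout runs out.
        float elapsed = 0f;
        while (GooglePlayConnection.state != GPConnectionState.STATE_CONNECTED && elapsed < timeout)
        {
            elapsed += Time.deltaTime;
            yield return null;
        }
    }

    You would kick this off with StartCoroutine(StartGooglePlayWithTimeout(10f)) from Start() instead of the call above.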

I can’t guarantee that these solutions will work for everyone, but it’s what I gathered from the thread. I’m sure I missed a few other gotchas for the setup process, and am willing to add them here if anyone gives me a ping about it. There are quite a few landmines that can get you into trouble, so be careful ;).

Good luck!

 

  • Unity 5.1.3p2
  • Android Native Plugin 6.8.1
Posted in Dev, Programming | 2 Comments

Zombie Model and Texture

Here’s some work in progress on my zombie model that’s closer to my production target. I’m still getting through the production grind!

zombie_prod

Posted in Art, Dev | Leave a comment

Number Crunchers Update!

I’m excited to see that more people are starting to try out Number Crunchers from the Google Play Store!

This encouraged me to update it a bit with a couple of things that I wanted to get in.

1.02 Release Notes

* Added a warning flash with 3 seconds to go before a safe zone disappears.
* Reduced the amount of time to wait for an extra life to 90 seconds.
* Fixed overlap for Game Over and In-Game Menu screens when backing out to Main Menu.
* Added background color selector to the in-game menu.
* Added error checking for an issue reported for RemoteException code -1001.
* Added Credits panel.

Here’s a screenshot of the background color selector:

NumberCrunchers_background_color_Screenshot_2015-08-03-12-23-16

 

The Number Crunchers update is now available on the Play Store.

Posted in Announcements | Tagged | Leave a comment

The Grind

It’s been over a week since I’ve posted, which is pretty uncharacteristic of me, judging by my posting history. Well, I’ve been caught up in the “grind”. Basically, that’s what I’m calling the Production stage of the game I’m currently developing. Since I usually like to put what I’m doing in perspective of a more general picture, I thought it was a great time to discuss the Game Development Cycle.

Before diving into what the game development (aka “gamedev”) cycle is, I’d like to step back for a bit, to describe game development in terms of overall thought processes.

Creativity

An idea for a game starts with that creative spark. It’s difficult to define, but when you are exposed to something new, or you dig up some thoughts or memories, something in your mind just flips a switch. Whereas some people form an idea for a short story, novel, play, sketch, painting, song, or movie, others come up with an idea for a game (Games as a creative work is a topic that deserves its own article, so it’s out of scope for this one). You can tell that I haven’t studied psychology at all, because I’m pretty sure there is terminology and explanations for exactly what I just described; something like neurons, or firing neurotransmitters across synapses, etc. <shrug>.

Anyway, some creative ideas are good, some are great, some are mediocre, and others are just plain terrible. Either way, to bring a video game from idea to reality is quite the challenge, especially since some of those ideas might be so off-the-wall or so overly imaginative, that they may go beyond the scope of what’s possible with current technology.

Software Development

Video games are software. That’s the bottom line, and the basis for the entire development process. Without electricity and some sort of computer hardware, video games are vaporware. With that in mind, it makes sense that you can apply general software development processes and principles to making video games. Right? Well, yes and no. The challenge with video games is that you are creating an artistic work. So for game development, you really need to incorporate a combination of software processes and creative, artistic processes to have any chance of realizing the vision of your game.

The Game Development Cycle

The game development cycle is the process of making a game from start to finish. I’ll give a brief description of each stage of the cycle, but I don’t want to go into too much detail. Instead, I’d like to describe them in terms of the feelings and emotions that a developer might have. So here’s the list as a snapshot, followed by more detail about each stage.

  1. Game Concept
  2. Game Prototype
  3. Art Concept
  4. Preproduction
  5. Production
  6. Alpha
  7. Beta
  8. Final
  9. Release

GAME CONCEPT

Ah, this is the most fantastic and exciting part of the development process. After you get an idea for a game, you start coming up with all sorts of ideas that spawn from that one idea and would make the game more interesting. You conceptualize your game by trying to uncover every possible detail that sounds like it would make the game great.

And of course, instead of keeping all of those ideas in your head, you had better document them somewhere. Speak them into a voice recorder; draw them on a whiteboard, piece of paper, or napkin; talk them over with someone to bounce around more ideas; or, more traditionally, write them all down in a game design document. You can further formalize this into a PowerPoint presentation.

GAME PROTOTYPE

In order to test out your ideas, your first attempt at building the software would be a prototype of the end product. This is where you take all of those great ideas from your game concept, or at least the high-risk features, and put them into a playable form (a “proof of concept”). Some game designers even create prototypes of gameplay features with pen and paper, without any software. But to ultimately get a feel for how your game will play, it’s probably best to write a prototype in software. This is the experimental stage, where you can figure out whether your ideas will work, at a greatly reduced cost in both time and money. It’s also during this stage that you and/or your team can start evaluating existing technology that you can leverage to make development faster or more streamlined, such as game engines, art/audio authoring packages, and other game development libraries and plugins.

Sometimes, you won’t know when to stop prototyping. When can you determine that the prototype has proven to you and/or your team that the game is actually fun? That’s for you to decide, and there is no rule for this. Prototyping is typically used to mitigate game design or technical risks before moving on to full-blown production, so if you feel those risks have been addressed, you can probably feel safe moving on.

ART CONCEPT

Along with the game prototype, concept art explores what the game will look like. Developing an art style is highly dependent on what sorts of emotions the game is supposed to evoke. The game ideas and prototypes can help identify the art style, as well as the genre you decide to classify your game under. Generating a wide range of concept art will also help inform the development team about how much work it will take to create a particular style, and alert the team to any challenges that the style may present to the technology that the team is using. Concept art usually takes the form of sketches, drawings, paintings, storyboards, and even mockup videos or animatics.

This stage is usually done along with the game prototype, but not necessarily in the game prototype, although that is not unheard of. These exploratory exercises stimulate even more ideas to help narrow in on the game’s identity.

PREPRODUCTION

You can argue that all the concept and prototyping stages can be considered Preproduction, but I separated it out here because concept and prototyping are so important on their own. Not only that, rolling into preproduction after finishing the concept/prototyping stages suggests that the full development project has been given the green light to proceed. Sometimes, after finishing concept and prototyping, you might just determine that the game wasn’t all that great after all, and it wouldn’t be worth it to proceed in making it a full blown production. In which case, you just saved a lot of time and money.

But, if you and/or your team are still excited about the game idea, you head into preproduction. You start integrating and experimenting with the existing technology that you evaluated during the prototyping phase, start to stand up core systems in code, and write tools to streamline the process of creating artwork and other game content.

At this point in development, you’re still pretty fresh and excited about the game. You start to see core gameplay mechanics getting implemented. Although the results are still pretty rough around the edges, you feel that the prototyping stage has proven itself, because the game is very fun, and with more time, you will run more passes at the gameplay to refine the mechanics and polish them. The systems are fairly disjointed at this point, if they’re even built, and still need integration with each other.

PRODUCTION

Now all the core systems are in place, the content authoring (art production) pipeline is in place, and it’s time to pump out all that artwork and gameplay. Authoring content for the game can, in some ways, be monotonous, in that you are building thing after thing: different types of game units, powerups, characters, items, etc., and sometimes there seems to be no end in sight. This is why I call this stage “the grind”. This stage takes up the longest development time in the cycle. You simply have to power through authoring the content. Occasionally, you’ll run into bugs that generate friction in the authoring process, and this can greatly hamper development. If, by chance, you do run into a serious issue that threatens the success of the project, or the project’s deadline, you have to re-evaluate continuing with the project. For the most part, you’ve invested so much time and effort into it that you have to eat the cost of however long it takes to solve the bug or issue, work around it, or compromise on the design. In fact, it may even be a game design oversight: as you work through feature development, perhaps one design decision conflicts with another, and the game is no longer fun. These are the sorts of obstacles that require course correction and really threaten the completion of the game. But it really is important to power through. If the game isn’t fun, it’s time to find other gameplay mechanics that you can substitute to make it fun.

ALPHA

“Alpha” is a software development term that varies across different software industries, and even across different game development studios. The Alpha stage that I am familiar with is when the game can be considered “feature complete”. This means all game features and systems are built, all artwork has been included in the game, but there are still a lot of major bugs to fix, and artwork to polish and improve. Declaring Alpha for your project is actually a cause for celebration. If you’ve made it through the production process, and the game is playable and is still fun, but just needs that extra polish to make things “solid”, you deserve a breath of fresh air. It’s like taking a large breather before going for that final push.

BETA

The “Beta” stage in software development is really the last step before calling it “finished”. This is the time to fix all remaining bugs, tune, tweak, and balance gameplay to make it as fun as possible, and add extra polish. This is also a grueling stage of development, since any remaining bugs may have a very low reproduction rate, or be very complicated to fix, which risks breaking the rest of the game if they are fixed incorrectly. And polish can sometimes be difficult to introduce at this stage, because some polish tasks end up taking a long time to implement, or, after a little investigation, reveal themselves to be complicated features rather than something that can simply be polished.

Generally, it’s a nerve-wracking stage, but the finish line is in sight. You get very critical about all the work you’ve done, and you want to make sure everything is perfect. So you can actually get pretty stressed out, but you feel that it’s been worth it.

Also, if you haven’t already, this is a great time to prepare any marketing material for your game. The game content is pretty much done, so any marketing screenshots, videos, or descriptions will better reflect the game at release time. It’s arguable that marketing should start way earlier than this, and I agree, especially for independent developers. However, any marketing you do before the game is complete will not reflect the final product. Nowadays, though, some gamers enjoy watching a game in development, and it helps to create buzz for the game before release.

FINAL

You can finally call your game “done”, but the Final stage is there to do any last minute testing in regard to compliance and integration with third parties, such as online stores.

RELEASE

And the game is finally released! You finally put a new product on the digital store shelves. It’s a time to rest, and also a time to work hard =). Basically, it’s at this time the marketing machine for your many hours/months/years of hard work kicks in to let as many people as possible know about your game. But marketing is out of the scope of this post, as you might imagine.

As you can see, the game development process is very much a journey. Making something out of nothing takes an incredible amount of time, effort, and dedication.

Realities

I’ve left out a lot of details about each stage of the process, mainly because each stage is so jam-packed with its own topics that it doesn’t make sense to discuss them here. Not only that, the dev cycle I outlined above is only a generalized look at the gamedev process. Some game developers have established production frameworks that adapt these stages to their own work methodologies. Agile software development comes to mind, and I’ll eventually write something about that too.

Game development is an iterative, creative process. Sometimes the end result doesn’t reflect exactly what’s in your mind’s eye. Sometimes, during the process, you get stimulated by other ideas that come about from something that you created, usually an interesting game mechanic or art style. Other times, you’re blocked by technological limitations or simply by the lack of time required to implement that technology, forcing you to find a way around, out of, or through that problem. This incurs unexpected cost, in terms of time and/or money. Existing code and/or artwork has to either be reworked to match the new design, or thrown aside to cut any further losses. It’s no wonder that large game titles nowadays get their release dates delayed, or have extremely buggy launches.

New Trends

A few key things to note. Software is broadly heading towards the idea of “Software as a Service”. Games used to be released, supported for a couple of months to patch up any serious bugs, and then development resources moved on to another project. But now, games are largely supported long after they are initially released. Especially with outlets that allow independent developers to release their games directly to their customers, it’s more important to have the mentality that whatever you release, you have to support and maintain for your customers, and keep them engaged with your game and the content it offers.

Keep Digging

As I mentioned at the beginning of the post, I’m deep into the Production stage of development. It’s a long haul. I’ve got to grind through it, and every game developer goes through it. But that’s the cost of completion. I have no idea how long it will take me to finish. Originally, I thought I’d be done by the end of August, but now I think mid-September, and even then, that’s pretty optimistic. But if you are ever to finish developing a game, you really have to set your goals, and try to set them even higher than you expect yourself to accomplish. Either way, I know I will eventually finish.

So, I guess I just wanted to inform game development newbies of what to expect during a gamedev cycle, and remind experienced developers what they go through to get to the end. That’s game development. It’s our passion, our labor of love. If you really want it, you’ll get it done.

Make it fun!

 

Posted in Dev, Thoughts | Leave a comment

Blender Rigify Bone Selector Panel

I’ve been ramping up my animation production lately, and one of the annoyances I’ve run across while using Blender’s Rigify armature is that it can be difficult to select some of the control bones, for a few reasons. In Wireframe display mode, trying to find the proper line segment to click on can be tedious. It helps to disable selection of meshes in the Outliner, but that still doesn’t let you see the appropriate object to select.

rigify_model_wireframe

In Solid, Texture, or Material display mode, again, it does help to disable mesh selection, but when the control bone is completely hidden by the mesh, as is the case for my character’s head, you have to start guessing where to click.

rigify_model_texture

The best option by far, suggested to me by reddit user WhatISaidB4, is to enable X-Ray mode in the Armature panel.

rigify_model_armature_xray

Definitely an improvement over the other viewing options, but as you can see, it’s still a bit cluttered when having to make rapid selections.

So I wrote up a Python script to add a bone selector in the Properties panel of the 3D View.

rig_bone_selector

The top “Use Front View” button toggles the view orientation, so that when you’re viewing the character from the front, the left-hand buttons select the right side of the character. If the option is disabled, then the left-hand buttons select the left side of the character.

The buttons outlined in boxes are the inverse kinematics (IK) bone selectors, and everything else refers to a forward kinematics (FK) control bone.

This panel differs from the Rig Layers panel, in that this is for selection, whereas the Rig Layers panel is for toggling the visibility of the rig layers in which each bone resides.

Below is the entire script. You can click on it to download it if you want, then just plop it into your /Blender/<ver>/scripts/addons directory for Blender and enable it in the File > User Preferences... > Add-ons window.

Download

#  UTW_RigBoneSelect

#Copyright (c) 2015 Under the Weather, LLC
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software
# and associated documentation files (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies
# or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
# INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE
# FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
# ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

bl_info = {
    "name": "UTW Rig Bone Selector",
    "author": "Under the Weather, LLC",
    "category": "Object",
    "description": "UTW Rig Bone Selector",
}

import bpy

class UTW_RigBoneSelectPanel(bpy.types.Panel):
    bl_space_type = 'VIEW_3D'
    bl_region_type = 'UI'
    bl_label = "Rig Bone Select"
    
    bpy.types.Scene.use_rig_front_view = bpy.props.BoolProperty(name="Use Front View", default=True, description="Enable if the character is facing frontwards.")

    def side(self, left_side_button, context):
        # Attribute access returns the BoolProperty value (including its default)
        # even before the user has toggled it, unlike an ID-property lookup.
        if left_side_button:
            return "R" if context.scene.use_rig_front_view else "L"
        else:
            return "L" if context.scene.use_rig_front_view else "R"

    def draw_leg_controls(self, col, context, side):
        col.operator("utw.selectrigbone", text="Thigh").bone_name="thigh.fk." + self.side(side, context)
        col.operator("utw.selectrigbone", text="Shin").bone_name="shin.fk." + self.side(side, context)
        box2 = col.box()
        box2.operator("utw.selectrigbone", text="Knee").bone_name="knee_target.ik." + self.side(side, context)
        box2.operator("utw.selectrigbone", text="Foot").bone_name="foot.ik." + self.side(side, context)
        box2.operator("utw.selectrigbone", text="Heel").bone_name="foot_roll.ik." + self.side(side, context)
        col.operator("utw.selectrigbone", text="Foot").bone_name="foot.fk." + self.side(side, context)
        col.operator("utw.selectrigbone", text="Toe").bone_name="toe." + self.side(side, context)
        
    def draw(self, context):
        layout = self.layout
        col = layout.column()

        row = col.row()

        row.prop(context.scene, 'use_rig_front_view', toggle=True)

        row = col.row()
        row.operator("utw.selectrigbone", text="Head").bone_name="head"

        row = col.row()
        row.operator("utw.selectrigbone", text="Shoulder").bone_name="shoulder." + self.side(True, context)
        row.operator("utw.selectrigbone", text="Neck").bone_name="neck"
        row.operator("utw.selectrigbone", text="Shoulder").bone_name="shoulder." + self.side(False, context)

        row = col.row()
        col1 = row.column()
        col1.operator("utw.selectrigbone", text="UpperArm").bone_name="upper_arm.fk." + self.side(True, context)
        col1.operator("utw.selectrigbone", text="Forearm").bone_name="forearm.fk." + self.side(True, context)
        row.operator("utw.selectrigbone", text="Chest").bone_name="chest"
        col2 = row.column()
        col2.operator("utw.selectrigbone", text="UpperArm").bone_name="upper_arm.fk." + self.side(False, context)
        col2.operator("utw.selectrigbone", text="Forearm").bone_name="forearm.fk." + self.side(False, context)
        
        row = col.row()
        box1 = row.box()
        box1.operator("utw.selectrigbone", text="Elbow").bone_name="elbow_target.ik." + self.side(True, context)
        box1.operator("utw.selectrigbone", text="Hand").bone_name="hand.ik." + self.side(True, context)
        col2 = row.column()
        col2.alignment = 'EXPAND'
        col2.operator("utw.selectrigbone", text="Spine").bone_name="spine"
        box2 = row.box()
        box2.operator("utw.selectrigbone", text="Elbow").bone_name="elbow_target.ik." + self.side(False, context)
        box2.operator("utw.selectrigbone", text="Hand").bone_name="hand.ik." + self.side(False, context)


        row = col.row()
        col1 = row.column()
        col1.operator("utw.selectrigbone", text="Hand").bone_name="hand.fk." + self.side(True, context)
        col1.operator("utw.selectrigbone", text="Palm").bone_name="palm." + self.side(True, context)
        colmid = row.column()
        colmid.operator("utw.selectrigbone", text="Torso").bone_name="torso"
        col2 = row.column()
        col2.operator("utw.selectrigbone", text="Hand").bone_name="hand.fk." + self.side(False, context)
        col2.operator("utw.selectrigbone", text="Palm").bone_name="palm." + self.side(False, context)

        row = col.row()
        col1 = row.column()
        self.draw_leg_controls(col1, context, True)

        col3 = row.column()
        self.draw_leg_controls(col3, context, False)

        row = col.row()
        row.operator("utw.selectrigbone", text="Root").bone_name="root"


class UTW_RigBoneSelect(bpy.types.Operator):
    bl_idname = "utw.selectrigbone"
    bl_label = "Bone Selector"

    bone_name = bpy.props.StringProperty()
 
    def execute(self, context):
        bpy.ops.object.select_pattern(pattern=self.bone_name, extend=False)
        return{'FINISHED'}   

def register():
    bpy.utils.register_class(UTW_RigBoneSelect)
    bpy.utils.register_class(UTW_RigBoneSelectPanel)

def unregister():
    bpy.utils.unregister_class(UTW_RigBoneSelect)
    bpy.utils.unregister_class(UTW_RigBoneSelectPanel)

#register()

I’ve seen some really slick bone selector utilities in Maya (you can do a Google image search for “Maya bone rig selector” to see what I mean), which I actually wanted to build for Blender, but this is about as far as I’m going to go. Hoping this helps streamline somebody’s Blender animation process.

Make it fun!

  • Blender 2.74
  • Python 2.7.10

 

Posted in Art, Dev, Programming | Leave a comment

Building Character Customization (Part 2)

Review

Welcome to the second part of my Character Customization blog post! The first part can be found here if you missed it, and it covers preparing a Blender3D file that contains a rig, several meshes weighted to that rig, and animations, as well as exporting that Blender file to multiple FBX files for Unity3D to use in a game. We ended up with this:

blender_export_fbx_rig_and_models

File Naming Convention

Before we go further into this, I need to mention a naming convention for all of these files. Keep in mind that this is only my naming convention. Your system may end up being different in the end, but the point is, in order for assets to be loaded into a game by script, it works best if the filenames follow a consistent convention.

My convention for the input Blender file is:

<base name><rig id>.blend

And specifically, my files are named like this:

  • HeroA.blend
  • HeroB.blend
  • etc.

The “A” is my rig identifier. And if I ever create new rigs for new characters, I’d have a HeroB.blend, HeroC.blend, etc.

My convention for the output FBX files is:

  • <base name><rig id>[rig]_rigify.FBX
  • <base name><rig id>[<character id>.<costume id>]_rigify.FBX
  • etc.

For example,

  • HeroA[rig]_rigify.FBX
  • HeroA[00.0]_rigify.FBX
  • HeroA[01.0]_rigify.FBX
  • etc.

The main thing to notice here is that one of those files is the rig file, which may or may not include the animations (for my purposes, I have included them), and all the other files are the various custom meshes that will be attached to the rig. Each of these files corresponds to a layer in the Blender file, and the layer name is the character string between the brackets.
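As a quick illustration, here is a small helper that turns these IDs into the resource paths used later in this post. This is my own sketch rather than code from the project, and the “models/heroes/” folder and two-digit character ID are assumptions based on the examples in this article.

public static class HeroResourcePaths
{
    // e.g. RigPath("Hero", "A") -> "models/heroes/HeroA[rig]_rigify"
    public static string RigPath(string baseName, string rigId)
    {
        return "models/heroes/" + baseName + rigId + "[rig]_rigify";
    }

    // e.g. ModelPath("Hero", "A", 1, 0) -> "models/heroes/HeroA[01.0]_rigify"
    public static string ModelPath(string baseName, string rigId, int characterId, int costumeId)
    {
        return string.Format("models/heroes/{0}{1}[{2:00}.{3}]_rigify",
                             baseName, rigId, characterId, costumeId);
    }
}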

The Resources Folder

Unity provides a method for loading resources from disk, and this is the method I use to load my FBX models:

public static Object Resources.Load(string path);

The catch is, any resources loaded with this function must be in a subfolder named “Resources” anywhere in the Unity Assets directory. You can even have multiple folders named “Resources” in the Assets directory if you choose to, but that can sometimes be a little confusing.

Memory for a resource is not allocated until Resources.Load() is called. This is different from the prefab method, which preallocates memory for each prefab, which I’ll discuss below.

Loading the FBX assets from disk is a lot more like traditional game development, and how typical programming works. You get to choose how to load the assets, and when to load them.

Choosing Resources.Load() over the Prefab Method

I chose to use this method because loading up each FBX as a prefab set up in a scene has a number of drawbacks.

  • All model resources would have to be preloaded as prefabs, which uses precious memory that could otherwise be used by your actual game. Why load all those custom parts when you’re not even using them?
  • They’d have to be manually updated whenever new layers or blend files are created.
    • For example, if you were to save a prefab reference in a script called, say, CharacterPrefab.cs, you’d have to update that script manually whenever you add a new character or costume, head, or hair model.
    • Alternatively, in CharacterPrefab.cs, you can add an array of GameObjects to store all your prefabs. This way, you can modify the count in the Inspector, and then drag/drop the FBX model into each slot.
    • Both of these solutions require you to access a MonoBehaviour script in one way or another. If there was a way to dynamically load those prefabs without having any intervention with the Inspector, then it would cut down on some of the manual processing.
  • There is an existing bug in Unity that doesn’t update prefabs that contain an FBX asset that has been updated. There are some suggestions online that mention attaching the FBX model as a child of an empty GameObject, which serves as the actual prefab. This has been a huge pain for me, because I’d have to manually delete the old FBX model from the prefab, and re-drag/drop the FBX model onto the prefab. That’s lots of wasted time.

It’s worth mentioning that it is possible to implement an AssetPostprocessor to update these. The idea is that whenever the FBX file is updated, you can have an AssetPostprocessor target an existing prefab that’s associated with it, say by filename or something, and then modify the prefab all in script. But I decided not to go this route.
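For anyone curious what that might look like, here is a rough, editor-only sketch of the idea (it would live in an Editor folder). This isn’t something I actually use, and the “Assets/Prefabs/<model name>.prefab” convention is purely hypothetical; it just shows the hook where a reimported FBX could be matched up with a prefab and fixed up in script.

using UnityEngine;
using UnityEditor;

// Sketch: react to model (FBX) reimports and find an associated prefab by name,
// so the prefab could be updated in script instead of by hand.
public class FbxPrefabUpdater : AssetPostprocessor
{
    void OnPostprocessModel(GameObject importedRoot)
    {
        // Hypothetical convention: Assets/Prefabs/<model name>.prefab
        string prefabPath = "Assets/Prefabs/" + importedRoot.name + ".prefab";
        GameObject prefab = AssetDatabase.LoadAssetAtPath(prefabPath, typeof(GameObject)) as GameObject;
        if (prefab == null)
            return;

        // Here you would re-link the freshly imported model into the prefab
        // (e.g. replace its child model object) instead of dragging and dropping it manually.
        Debug.Log("Prefab to update: " + prefabPath + " (imported from " + assetPath + ")");
    }
}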

That should be it for all of my reasons to NOT use prefabs that reference the FBX files.

Loading the FBX files in Unity

All that talk about Resources.Load() comes down to this section. Before we get into any code we have to get these FBX files loaded into Unity. These files simply need to be copied into the Unity project’s Assets folder tree,  remembering that they MUST be contained in, or in a subfolder of, a folder named “Resources”. You can do this by either exporting the FBX files directly into the target Assets folder manually from Blender’s UI, or automatically via a series of scripts as I’ve discussed in my blog post about a Blender-to-Unity Asset Builder. Here’s an example of a folder tree that I have for my own project:

resources_dir_example

Setup

All right, let’s do some Unity stuff.

To serve as a testing ground, open up a new scene, and place an empty GameObject in the Hierarchy. We’ll attach a script to it which will do all of our resource loading.

using UnityEngine;
using System.Collections;


public class CustomCharacterTest : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {

    }

    // Update is called once per frame
    void Update()
    {

    }
}

Empty_GameObject_w_CustomCharacterTest_script

We’ll implement all of the loading code in our MonoBehaviour’s Start() function.

Loading the Rig

        GameObject rigObj = null;
        GameObject rigPrefab = Resources.Load("models/heroes/HeroA[rig]_rigify") as GameObject;
        if (rigPrefab != null)
        {
            rigObj = GameObject.Instantiate(rigPrefab);
            rigObj.transform.parent = this.transform;
        }

Note that when loading from Resources, you’re loading the asset into your project, not instantiating the object into the scene hierarchy. You can think of it as dynamically loading a prefab to instantiate from. Since Resources.Load() can load any type of asset, it’s not always the case that it’s a GameObject. In this case it is, so after we load the prefab, we can instantiate it.

After loading the rig, you should see something like this in the Hierarchy:

rig_hierarchy

The rig’s GameObject hierarchy should be kept intact from the Rigify armature imported from Blender. If you have the RigifyToUnity asset installed, which you should, the hierarchy will be somewhat stripped down to the essentials, as in the screenshot.

Now, WTF is that “Plane” object you ask?

plane_object

The Dummy Object

Okay, I only glazed over this in Part 1 of this post, so I apologize for that, but here is more detail about it, because it is essential for getting the skinned mesh working later on.

The “Plane” GameObject is a very simple mesh that’s skinned to the rig, and it can actually be named anything. It’s just a “dummy” object that holds a SkinnedMeshRenderer that we need for later. If you select it and check out the Inspector, you’ll see the SkinnedMeshRenderer component assigned to it.

plane_object_skinned_mesh_renderer

This mesh can actually be any geometric shape that you create in Blender (even a stick dummy if you want to make visualizing animations easier), as long as it resides on the rig’s layer, and gets exported with the rig as a skinned mesh. I just wanted to choose the simplest object I could to reduce memory consumption, but after we’re done with it, it gets deleted anyway.

BoneDebug

At this point, if you’re not visually convinced that the rig is loaded, I found a nifty asset on the Asset Store called BoneDebug. You can drop the BoneDebug.cs script onto the rig or root GameObject, and it will visualize the armature for you!

bone_debug

Utility Functions

Before we get to loading the meshes onto the rig, I need to get some things out of the way. These are a few utility functions that I use in the code below, so know that they’re there.

public class TransformUtils
{
    // Matching is always case-insensitive
    public enum StringMatch
    {
        Exact,      // Exactly
        Leading,    // Leading the string
        Partial     // Anywhere in the string
    }

    // TransformUtils
    // Recursively look for the bone with specified name
    public static Transform GetTransformWithName(Transform root, string name, StringMatch match = StringMatch.Exact)
    {
        if (match != StringMatch.Exact)
        {
            string rootname = root.name.ToLower();
            int index = rootname.IndexOf(name.ToLower());
            if (match == StringMatch.Leading && index == 0)
                return root;
            else if (match == StringMatch.Partial && index != -1)
                return root;
        }
        else
        {
            if (string.Compare(root.name, name, true) == 0)
                return root;
        }

        for (int i = 0; i < root.childCount; ++i)
        {
            Transform child = root.GetChild(i);
            Transform sub = GetTransformWithName(child, name, match);
            if (sub != null)
                return sub;
        }
        return null;
    }

    // TransformUtils
    public static void ResetTransform(Transform transform)
    {
        transform.localPosition = Vector3.zero;
        transform.localRotation = Quaternion.identity;
        transform.localScale = Vector3.one;
    }

    public static void ResetChildTransforms(Transform parent)
    {
        for (int j = 0; j < parent.childCount; ++j)
        {
            Transform child = parent.GetChild(j);
            TransformUtils.ResetTransform(child);
        }
    }
}

Loading the Model

GameObject modelPrefab = Resources.Load("models/heroes/HeroA[01.0]_rigify") as GameObject;

This time, instead of instantiating the model prefab wholesale, we’ll “pick at” the prefab in a piecemeal manner. Essentially, we just want to load the head and hair mesh objects, and also the body mesh object. We don’t want to instantiate the rig that they are attached to in the model prefab.

Loading the Head and Hair

The head and hair are relatively simple to load compared to the body, so we’ll start there. The head and hair objects are not skinned to the armature. They are simply parented to a bone on that armature.

        GameObject modelPrefab = Resources.Load("models/heroes/HeroA[01.0]_rigify") as GameObject;
        if (modelPrefab != null)
        {
            Transform headXfrm = TransformUtils.GetTransformWithName(modelPrefab.transform, "head", TransformUtils.StringMatch.Leading);

            Transform parentBoneXfrm = headXfrm.parent;

            Transform rigHeadBoneXfrm = TransformUtils.GetTransformWithName(rigObj.transform, parentBoneXfrm.name, TransformUtils.StringMatch.Exact);

            GameObject headObj = GameObject.Instantiate(headXfrm.gameObject) as GameObject;

            headObj.transform.parent = rigHeadBoneXfrm;

            TransformUtils.ResetTransform(headObj.transform);
            TransformUtils.ResetChildTransforms(headObj.transform);
        }

If it’s not clear as to what the script is doing, here’s what’s going on in semi-plain English. GetTransformWithName() gets the head transform in the prefab. We also want to remember the head transform’s parent too. With the parent transform’s name, we look for the bone with that same name in the rig object hierarchy (because the rig object is what’s actually instantiated in the scene). We then instantiate the head object from the prefab, and set its parent to the bone that we spent the last few lines trying to get. After all that’s done, we reset the transforms of the newly created objects, namely the head and hair objects.

With this, you can load up the scene, and see that the head and hair objects are loaded, especially with the BoneDebug component applied to the rig.

head_and_hair_attached

I want to mention that the calls to ResetTransform() and ResetChildTransforms() are here because I couldn’t get the head and hair to look right after instantiation. I don’t think they’re necessary if you apply your transforms properly in Blender, but for some reason, they still looked wacky to me. Go figure.

Loading the Body

And now for the main event! The body mesh is skinned to the rig with a SkinnedMeshRenderer. However, in terms of the model prefab that we loaded above, this means that the body mesh of the prefab is skinned to the prefab rig with the prefab SkinnedMeshRenderer. This does us no good, because we don’t want a SkinnedMeshRenderer referring to a rig in the prefab. We want to use the rig that’s already loaded in the scene.

But how do we go about doing that?

Let’s forget about the rig in the scene for a second, and focus on getting the mesh from the prefab’s SkinnedMeshRenderer.

            Transform bodyPrefabXfrm = TransformUtils.GetTransformWithName(modelPrefab.transform, "body", TransformUtils.StringMatch.Leading);
            SkinnedMeshRenderer prefabSkinnedMeshRenderer = bodyPrefabXfrm.GetComponent< SkinnedMeshRenderer >();
            Mesh mesh = prefabSkinnedMeshRenderer.sharedMesh;
            Material mat = prefabSkinnedMeshRenderer.sharedMaterial;

We’ll hold on to the Mesh and Material here.

Static Mesh Test

To really understand what’s going on here, let’s load up the body mesh onto a standard, static MeshRenderer. Here’s the function that does the work.

    private void CreateStaticModel(Transform parent, Mesh mesh, Material mat)
    {
        GameObject bodyObj = new GameObject("body_static");
        bodyObj.transform.parent = parent;

        MeshFilter filter = bodyObj.AddComponent< MeshFilter >();
        filter.sharedMesh = mesh;

        MeshRenderer renderer = bodyObj.AddComponent< MeshRenderer >();
        renderer.sharedMaterial = mat;
    }

And then here’s the function being called, with the Mesh and Material from above.

CreateStaticModel(rigObj.transform, mesh, mat);

What we did here is essentially take the Mesh and Material references that were assigned to the prefab’s SkinnedMeshRenderer, and attach them to newly created MeshFilter and MeshRenderer components.

body_static

That’s fantastic for making scarecrows. Yes, I know. But what we can take away from this is that if we cache references to the Mesh and Material from the SkinnedMeshRenderer, we can easily apply them back onto a new SkinnedMeshRenderer that’s attached to the rig in our scene.

Swapping the SkinnedMeshRenderer

So for the real deal, we’ll take our rig object that’s in the scene and add a new SkinnedMeshRenderer component to it. The difference between the static MeshRenderer and the SkinnedMeshRenderer is that we need to provide the bones used to skin the mesh. Remember our “Plane” object from a few sections back? That object has a valid SkinnedMeshRenderer, which contains the proper bone mappings from the rig in the scene to the mesh. Also notice that after we’re done with the “Plane” object, we destroy it, because we don’t need it any more. The Mesh and Material assignment is essentially the same as before. Here’s the function.

    private void CreateSkinnedModel(Transform parent, Mesh mesh, Material mat)
    {
        Transform planeXfrm = TransformUtils.GetTransformWithName(parent, "Plane", TransformUtils.StringMatch.Exact);
        SkinnedMeshRenderer rigSkinnedRenderer = planeXfrm.GetComponent< SkinnedMeshRenderer >();
        if (rigSkinnedRenderer != null)
        {
            GameObject obj = new GameObject("body_skinned");
            obj.transform.parent = parent;
            SkinnedMeshRenderer skin = obj.AddComponent< SkinnedMeshRenderer >();
            skin.sharedMaterial = mat;
            skin.sharedMesh = mesh;
            skin.bones = rigSkinnedRenderer.bones;

            GameObject.Destroy(planeXfrm.gameObject);
        }
    }

And the call.

CreateSkinnedModel(rigObj.transform, mesh, mat);

And finally, we have our fully skinned character model!

body_skinned

Conclusion

With this dynamic model loading process, you can probably see how we can load up any FBX resource that we want, dig into the prefab hierarchy, and pull out a mesh or object, skinned or not, and how a uniform naming convention helps with this. You’d be able to call up character “01” with costume “3” by using the string “[01.3]” in the filename to apply to rig “A”, and mix and match accordingly, while loading only the FBX files that you absolutely need at any given time. It’s probably pretty obvious that this is just the bare bones of a custom character system, but I think this is the core of the work that needs to be done. Another thing we can do is attach an Animator and apply different animation clips, which requires loading an AnimatorController, but that’s beyond the scope of this subject. We can also play around with assigning different materials or textures, which is also out of scope for this blog entry. Anyway, I hope this helps some Unity developers who are interested in introducing character customization into their game, or even sheds some light for non-developers on the work that’s involved in getting such a system working.

Make it fun!

  • Unity 5.1.1f1
  • Blender 2.74
Posted in Dev, Programming | 11 Comments

Building Character Customization (Part 1)

For the past week or so, I’ve been adding more features to my asset build system on the Blender side and asset loaders on the Unity side to support character customization.

What is character customization you ask? Read below! (And of course, if you’re not asking, skip to the Requirements section below.)

Overview

Character customization is a game system that allows a player to choose parts and settings of a character’s appearance to define a more unique experience for the player. Examples of this include different hairstyles, body types, clothes, face shapes, and skin color.

Games that include character customization provide the player with a sense of identity for the hero character. The more customizations a game supports, the more specifically the character can resemble the player’s identity, whether that identity is a reflection of himself/herself, or someone else, like a role model, icon, or celebrity.

Character customization has grown in popularity over the past decade or more in video games, and now, a lot of players take this system for granted. It seems that as games are released with more customization options, future games are expected to have more options than their predecessors.

From a game development perspective, the reality of implementing character customization is still considered a massive undertaking. Character models no longer have a one-to-one relationship between rig (skeleton) and mesh (geometry). Instead, rigs are shared between multiple models that represent any number of variations of hair, heads, eyes, body types, clothes, accessories, sexes, etc. As you might imagine, authoring the content for all of these variations can be boundless, and the number of combinations is exponential. And that’s why game developers have to place a certain limit on those choices, unless, of course, one of the following happens:

  • The game never gets released.
  • Someone pays for all the content to be developed.
  • Players are allowed to create their own content.

Okay, so that last bullet point is an article in itself, so I’m not even going to go there.

One last thing I want to mention before I move on. The reason why the rig is typically shared between character models is so that animations can be shared between each model. Animation is expensive all across the board. Animator authoring time, computer disk space, in-game memory and processing power all are significantly impacted by 3D skeletal animation, so sharing these animations between characters helps a ton.

Next up, here are the requirements for my game.

Requirements

Gameplay

For my game, I’m aiming to keep it as simple as possible while providing a fair amount of customization. The player can change:

  • Character
  • Hair Style
  • Hair Color
  • Head/Face Type
  • Skin Color
  • Costume

For launch, I plan on supporting one male and one female hero character, with around 2-4 different types for each category. That’s a reasonable amount of work for about a month of development time.

Technical

So, the gameplay requirements are just the surface of what we need. To achieve this, we need to nail down the technical requirements to get to the end result.

As a brief refresher, I’ve been working on an asset builder to convert .blend files created in Blender3D to FBX files for the Unity3D game engine to use. While this has been working great for my purposes, supporting character customization means that the asset builder, among other parts of my game, has to change a bit.

How It Works Now

blender_blend_to_fbx

Currently, nothing changes here between the input and output files, other than the FBX file being slightly smaller than the .blend file. All geometry is bound to the skeleton in one blend file, and also in one output FBX file. I also added a rough graphic indicating that each file contains the armature, meshes, and animations, and that animations take up a significant amount of space, on disk as well as in memory.

Before I go into details about my solution, it’s worth mentioning that there is more than one way to do this. That said, I’m switching to Blender mode now.

Blender and Linking Fail

Blender offers a feature called Link. Most 3D packages have a similar feature though. My original intention was to create one .blend file that contains the rig (armature) and animations, and then a .blend file for each custom hair/head/costume. The theory behind this is, you can Link (or reference) an external .blend file (namely the rig/animation blend file) into the mesh (geometry) .blend file, so that changes that occur to the rig or animation only happen in one file. This enforces that the mesh is using the shared rig, so that the bone names, hierarchy, and vertex weights conform to that rig; a requirement for animation to work correctly in the run-time game environment.

blender_blend_file_link_armature

Alas, I was unable to get the Link feature working correctly for the animated Rigify armature in Blender, so I needed to use a workaround. If anyone out there reading this has actually figured that out, please drop me a line. Linking a reference to the rig is actually the most ideal way to author these models, and it’s unfortunate that I couldn’t get this working.

Using Blender Layers to Manage Meshes

The workaround for this problem is to work out of one .blend file for all custom meshes that are intended to be attached to a specific rig/animation combo. It’s a little cumbersome to work with, but it also guarantees that every mesh uses the appropriate bone names, hierarchy, and vertex weights. The reason it’s cumbersome is that multiple meshes need to be parented to the same armature in the same file, which can make them difficult to manage.

blender_multiple_meshes

To alleviate this problem, Blender (and almost all other 3D packages) has a layer system to hide objects in the scene that are not on a currently active layer. Moving hair, head, and body meshes to an appropriate layer makes managing them a little bit more tolerable.

blender_layers

The above image is from Blender’s Layer Management plugin that is not enabled by default. This needs to be enabled in User Preferences > Add-ons > 3D View: Layer Management to see this in action.

So, that solves one problem. Authoring custom character parts can be done in one file that is specific to a rig/animation combo. This means for any additional characters that may have different rigs and/or animations, a new .blend file must be created.

But we still don’t have a solution for actually exporting these for Unity to use. Of course, we could export the entire .blend file, including ALL meshes. Unity would then load all mesh resources into the game when you load the FBX file, and your game code in Unity could look for the appropriate mesh name to load (i.e. “body”, “body.001”, “body.002”, etc.). I think this solution is suitable for games that don’t have a lot of custom meshes. However, I think it would also clutter the animation viewport in Unity, because all the custom meshes are skinned to the skeleton.

The problem with this is that the game is now unnecessarily loading all custom meshes, when you really only want one loaded at any one time.

blender_export_fbx_rig_and_all_models

This also means that the more custom meshes you have in the .blend file, the more valuable memory you’re wasting on loading unused meshes. Below, I describe how I implemented my solution.

Saving Memory and Sanity

Here’s a graphic outline of my solution.

blender_export_fbx_rig_and_models

As you can see, I’ve set up my build process so that the .blend file exports multiple FBX files. The key FBX file contains just the rig, animation, and a low-poly mesh*, and then all other FBX files contain the custom meshes bound to the same rig, and no animations. This way, when we load the FBX files in Unity, only the meshes that we need are ever loaded into memory. You’ll also notice that the rig is duplicated for each model FBX file. This is a small cost compared to loading any number of unused meshes.

* A note about the low-poly mesh. I use the Unity asset called RigifyToUnity, which removes unnecessary bones from the Rigify armature on import of the FBX. RigifyToUnity actually doesn’t recognize the armature without a skinned mesh. Of course, you could simply modify RigifyToUnity to support skinless armatures, but I’d rather keep that script intact. Instead, I added a simple quad mesh and skinned it to the armature. This skinned quad will also prove important when importing the model into Unity, as it provides a SkinnedMeshRenderer component that can be used to swap in alternate models. I’ll write about that in the next part of this series.

Modifying the Asset Builder

I’ve used the scripts that I wrote about in my post called Blender to Unity Asset Builder, and simply added support for Blender layers. I’m not going into full detail of how I implemented this, because it would take too long, but feel free to ping me if you have any questions. You always have the option to export your armatures/animations and meshes manually with Blender’s Export UI if you choose not to use Python scripts, but I’d still like to point out some utility code that will be useful if you choose to implement this yourself.

To access the names of the layers in a Blender scene, you first have to make sure that the Layer Management plugin is enabled, as I mentioned above. Then, you’d be able to access the names like this in Python:

import bpy

for layer in bpy.data.scenes[0].namedlayers.layers:
    print(layer.name)

The whole function looks like this:

# gets tuples consisting of (layer index, layer name)
def get_named_layers(sceneIndex=0):
    ret = []
    index = 0
    # NOTE: If namedlayers raises an error, make sure the Layer Management plugin is enabled in Blender
    for layer in bpy.data.scenes[sceneIndex].namedlayers.layers:
        # skip layers that still have the default "Layer" name (i.e. unnamed layers)
        if layer.name[:5] != 'Layer':
            ret.append((index, layer.name))
        index += 1
    return ret

My rig layer’s name starts with “rig”, so I have a function that finds it:

def get_rig_layer(sceneIndex=0):
    layers = get_named_layers(sceneIndex)
    for layer in layers:
        if layer[1][:3].lower() == 'rig':
            return layer
    return None

You can also select objects by layer using this Blender command:

 bpy.ops.object.select_by_layer(layers=layerIndex+1)

The layer indices are one-based, hence the +1.

And lastly, the options passed into the FBX export call:

bpy.ops.export_scene.fbx(filepath=outputfilepath,
                         axis_forward='-Z', axis_up='Y',
                         use_selection=export_selected,
                         bake_space_transform=True,
                         object_types=object_type_set,
                         use_armature_deform_only=True,
                         bake_anim=exportAnimations,
                         use_anim=exportAnimations)

Where:

  • export_selected is a boolean indicating whether to export only the objects selected by layer above.
  • object_type_set specifies that we are exporting both the 'ARMATURE' and 'MESH' object types, but nothing else.
  • exportAnimations specifies whether animations are baked into the FBX; this should be true only for the rig file, not for the mesh files.

The reason I want to capture the layer names is so that the filenames of the exported FBX files contain the layer name; I use the layer name when loading the FBX into Unity.
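
To show how the utility functions above fit together, here’s a simplified sketch of the per-layer export loop. It’s not my full builder script: the output directory, the 'character_' filename prefix, and the decision to also select the rig layer for every mesh export are assumptions you’d adapt to your own setup.

import os
import bpy

# Simplified sketch of a per-layer FBX export loop.
# output_dir and the 'character_' prefix are placeholders.
output_dir = bpy.path.abspath('//fbx_exports')
object_type_set = {'ARMATURE', 'MESH'}

rig_layer = get_rig_layer()  # (index, name) of the layer holding the rig, or None

for index, name in get_named_layers():
    is_rig_layer = rig_layer is not None and index == rig_layer[0]

    # Select everything on this layer; for mesh layers, also grab the rig layer so
    # each FBX carries a copy of the armature that the mesh is skinned to.
    bpy.ops.object.select_all(action='DESELECT')
    bpy.ops.object.select_by_layer(layers=index + 1)  # layer indices are one-based here
    if not is_rig_layer and rig_layer is not None:
        bpy.ops.object.select_by_layer(extend=True, layers=rig_layer[0] + 1)

    # Only the rig FBX gets baked animations; the mesh FBX files stay animation-free.
    export_anims = is_rig_layer

    filepath = os.path.join(output_dir, 'character_' + name + '.fbx')
    bpy.ops.export_scene.fbx(filepath=filepath, axis_forward='-Z', axis_up='Y',
                             use_selection=True, bake_space_transform=True,
                             object_types=object_type_set, use_armature_deform_only=True,
                             bake_anim=export_anims, use_anim=export_anims)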

Conclusion

That’s about it for now. Part 2 of this article explains how I load the rig and model FBX files into Unity. This does seem like a lot of work to support character customization. It is a lot of work. But these sorts of systems and optimizations are what prevent a game from running like a snail, and some of these techniques are used across the industry, in one form or another. Also, when the people playing your game are super excited about changing the look of their character, isn’t it all worth it in the end?

  • Unity3D 5.1.1f1
  • Blender3D 2.74
  • Python 2.7.10
Posted in Dev, Programming | 3 Comments

Turning Concept Art into a 3D Model in Blender

I’ve been writing a lot about the software engineering side of game development recently, and today, I decided to write about the art side. For my next game, I’m aiming for a lighthearted “cartoony” look, while capturing the adventurous fantasy theme that’s fairly prevalent in the games industry today. These were my first few drawings of the first hero character. I plan to have others, but you have to start somewhere.

hero01

It eventually settled into looking like this:

hero_02

Here are some things to note about these two images. Over the course of iteratively drawing the same character, the character progressively receives an identity. The key is to keep on throwing things on the page until you capture a rhythm to discover what your mind wants, as well as what looks good. You can also tell that the last image is a lot more dynamic than the statically standing mockups in the first image. This probably contributes a little to a biased decision as to “what looks better”, but realistically, that’s the sort of stance the character would take in interactive software. He’s not going to be just standing around. You can also get a sense of the proportions that the character naturally looks better in. The proportion of head to body is just about right for a cartoon, being 3 heads tall. However, I’m drawn towards the slightly more bulky look of the hero at the bottom of the first image. It’s something that I’ll have to play around with.

Once you nail down an identity for the character, in order to absolutely get the proportions right, it helps to sketch the character in a model sheet. This is simply the character standing in a T-pose, from the different orthogonal viewpoints (front, side, back). In 3D modeling and animation, this pose is ideal for supporting the different ranges of movement for each joint.

model_01 model_02

Once you have a model sheet sketched out, you can scan it or take a picture with a smart phone or digital camera, to bring it onto the computer. Before scanning it, I outlined the figure with solid black, so that the form is more definitive.

hero01_model

 

You could also erase the remaining pencil marks before scanning, but I got lazy. Instead, you can usually get away with adjusting the brightness and contrast of the image in a paint program so that only the inks show up. Another trick is to use a non-photo blue pencil instead of a straight dark gray pencil, since blue marks are removed much more easily by the image adjustments I mentioned above.

hero01_model_inks

After cleaning up the model sheet, you can crop the front and side views such that they’re their own image files. It’s important to note that when cropping the images, the front and side images need to:

  • Match in pixel size
  • Match the character’s height and base position of the feet.

It’s also recommended to align the front image such that the character is divided equally down the center.

hero01_mdl_front hero01_mdl_side

Once the images are cropped appropriately, they can be loaded into Blender (or your 3D modeling package of choice) as background images.

background_image_position
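
For completeness, here’s a rough script-equivalent of setting those background images up, assuming Blender 2.7x’s background-image API. I actually did this through the 3D View’s properties panel, and the file paths and the choice of LEFT for the side view are just examples.

import bpy

# Sketch: load the cropped front/side drawings as background images in the 3D View.
# The paths are placeholders, relative to the .blend file.
images = {'FRONT': '//hero01_mdl_front.png', 'LEFT': '//hero01_mdl_side.png'}

for area in bpy.context.screen.areas:
    if area.type != 'VIEW_3D':
        continue
    space = area.spaces.active
    space.show_background_images = True
    for axis, path in images.items():
        bg = space.background_images.new()
        bg.image = bpy.data.images.load(bpy.path.abspath(path))
        bg.view_axis = axis  # only shown in the matching orthographic view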

Once they’re positioned properly, you can start modeling the character using the images as a guide. I start out with a cube and extrude the faces until I get a form that fits roughly around the background images.

side_template

front_template

You’ll notice that from the front view, the model is perfectly mirrored across the X axis. That’s not just me being careful. It’s actually half of a model with a Mirror modifier added to it. This way, you only have to modify one side, and the other side is automatically manipulated for you.

After extruding out a basic blocky form, you can apply a subdivision surface modifier (available in almost all 3D modeling packages) to smooth out the mesh.

subsurf
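
If you ever want to set these two modifiers up from a script instead of the modifier panel, the equivalent is only a few lines. This is just a sketch, and 'hero_body' is a placeholder object name.

import bpy

# Sketch: add the Mirror and Subdivision Surface modifiers to an example mesh.
# 'hero_body' is a placeholder name; the modifiers get applied (baked down) later.
obj = bpy.data.objects['hero_body']

mirror = obj.modifiers.new(name='Mirror', type='MIRROR')
mirror.use_x = True  # mirror the editable half across the X axis

subsurf = obj.modifiers.new(name='Subsurf', type='SUBSURF')
subsurf.levels = 2          # viewport subdivision level
subsurf.render_levels = 2   # render-time subdivision level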

At this point, I was happy with how the base model looks, so I applied the subdivision surface modifier as well as the mirror modifier to convert the model into a single editable mesh. The next step after that is marking the seams of the model. I’m not sure if Maya or 3ds Max have introduced seams, since I haven’t used those packages for several years now. I was introduced to the concept of seams in Blender.

Basically, marking edges of a model as “seams” denotes those edges as boundaries for “unwrapping” the model to apply textures. Imagine wearing a jacket and then taking it off and laying it on a flat surface to “unwrap” it for, say, sewing a piece of cloth on it. Since textures (images) are two-dimensional, a model has to be unwrapped to place those textures onto the model. This is known as UV mapping. Marking seams simply indicates where flat sections are located, to reduce any weird distortions from occurring, and to group those sections into logical, contiguous pieces.

geo_details_and_marked_seams
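
Script-wise, marking seams and unwrapping boil down to a couple of operators. This is only a sketch and assumes you’re in Edit Mode with the seam edges already selected.

import bpy

# Sketch: mark the selected edges as UV seams, then unwrap the whole mesh.
bpy.ops.mesh.mark_seam()                    # flag the selected edges as seams
bpy.ops.mesh.select_all(action='SELECT')    # unwrap operates on the selected faces
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.02)  # margin pads the space between UV islands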

And here are the unwrapped UVs mapped onto a flat texture. The texture contains all the pieces of the model and keeps the UV coordinates organized. I do need to mention that your UV coordinates won’t immediately be mapped as nicely as they are in the picture below. I had to rotate and move the pieces to keep them arranged nicely. However, Blender has a way of selecting all of the polys bounded by a closed seam loop, which makes manipulation a lot easier.

base_texture_paint

As you can see, the texture only has base colors applied to the sections, so I still have to put in the details, which you can actually add with texture painting directly in Blender. The way that works is you can draw directly on the texture in the 3D viewport, as if you were painting designs on a real sculpture.

model_with_head

The head is actually a separate model, but honestly, right now it’s just a low-poly sphere that was squished a little. Also, the hair can definitely use a lot more work, but that’s a completely separate discussion.

Well, that’s about it. When I started this new project, I actually wasn’t planning to do all the intermediate steps between concept and modeling. I thought I could just jump into Blender and start. While there are probably plenty of 3D artists that can do this, I am not one of them =). The problem really wasn’t that I wouldn’t finish; it was that I wasn’t sure whether what I modeled would match well with my concept. Doing the intermediate steps helps to translate the concept into 3D with a lot more precision, takes out a lot of the guesswork, and uncovers any potential challenges in building the 3D model.

 

Posted in Art, Dev | Tagged | 1 Comment