

Jam Week 2019: Golem Jox 3

Golem Jox 3 (GJ3) is a prototype demo I developed during Schell Games’ Jam Week. Here’s a quick preview:

Essentially, once a year the studio “closes” and allows its employees to work on whatever they want – within reason. Usually I work on something fighting game related by myself. Last year, for example, I worked on developing something that utilized my own rollback netcode solution in Unity. This year I decided to experiment with what I was calling a “Single Player Fighter” or “Fighting Game RPG.” Someone suggested a fighting adventure game; someone else, a turn-based fighter. I’m still not 100% sure what to call it, or if it’s even that unique as apparently there are a few games that have attempted similar approaches.
The game flow is rather simple.

  1. Player explores rather simple environments
  2. Player encounters an enemy
  3. Short dialog introduction
  4. The player’s turn begins: they attack the enemy, trying to deal as much damage as possible in an allotted amount of time
  5. The enemy takes their turn
  6. Repeat steps 4 and 5 until someone wins
  7. If the player wins, return to 1; otherwise, end the game

Questions

So many, many questions…

I find one of the primary goals behind prototypes is to answer questions. Here are some of the questions I was trying to answer with GJ3’s prototype:

How should the player explore the environment?

I decided to just have the player explore the environment like they would if they were in a 2D fighting game.  I feel that if I had tried to implement a top-down RPG exploration map or a 4-way movement system, especially within the 4-day jam period, I wouldn’t have gotten to answer a lot of the other questions I was trying to answer. This also allows players to practice various moves, and I can “teach” how to perform different attacks in the environment.

I know it’s not the “right” input for that attack style…

Do character move sets evolve over time? If so, how?

The Golem Jox theme sort of comes in for this. Golem Jox is a silly IP I’ve used for Jam Weeks in the past in which players control a “golem,” an entity made of random things. You start off as “Juhnk,” a golem made of white cubes. As you progress, you swap and equip different “limbs.” Some limbs are more powerful than what you previously had, granting new moves, more attack power, or other changes such as increased max health. I sort of “force” limb switching by locking off sections that can’t be entered without wearing different limbs. During playthroughs, most people didn’t switch back after going through a “door” once they realized the new limb or move set was better.

For this prototype the idea was:

  • Your base or body, torso and head, determined things like your walk speed, jump weight, max health, etc. Unfortunately, I didn’t get very far with these.
  • Left arm was for weak or light punch
  • Right arm was for strong or heavy punch
  • Left leg was for weak or light kick
  • Right leg was for strong or heavy kick

Players were then supposed to have a forward and/or back special move for each non-torso limb and a super attack, but this sadly didn’t happen due to time. In the prototype, they got unique limbs and some had unique special moves, but supers were never implemented.

How do you prevent players from sticking to one set?

Sadly, this question is still unanswered. What I wanted to try is having the player level up not based on how many matches they win, but on how often they use a limb. So, for example, if I’m level 1 and I use my left arm 5 times in one fight, and it levels up to level 2, then I level up to level 2 as well. However, if the same limb is level 4 and maxed out, then I will no longer gain EXP for using it. As a player, I’d have to make the choice: “Do I keep using a limb I’m really good with, or do I equip a newer, maybe weaker one, so I can continue to level up overall?”
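
To make that concrete, here’s a minimal sketch of the rule I had in mind; all of the names and numbers below are hypothetical, since none of this exists in the prototype yet:

// Hypothetical sketch of the limb-based leveling rule described above.
// None of these classes or numbers exist in the GJ3 prototype yet.
public class Limb
{
    public const int MaxLevel = 4;
    const int UsesPerLevel = 5;

    public int Level = 1;
    int uses;

    // Returns true if this use caused the limb to level up.
    public bool RegisterUse()
    {
        if (Level >= MaxLevel)
            return false;           // Maxed-out limbs no longer grant EXP.

        uses++;
        if (uses >= UsesPerLevel)
        {
            uses = 0;
            Level++;
            return true;
        }
        return false;
    }
}

public class PlayerProgress
{
    public int Level = 1;

    // The player only levels up when one of their equipped limbs does.
    public void OnLimbUsed(Limb limb)
    {
        if (limb.RegisterUse())
            Level++;
    }
}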

Again, unfortunately, due to time, I didn’t get this far, but it is probably the first question I would try to answer next if I were to continue polishing this prototype.

The other, final question, which I’m not 100% sure is answered, is whether a player will enjoy this gameplay loop.  That’s difficult to tell without more work, but based on the playtests I had, I think that, with a lot of polish to the combat itself, they could.

Learning New Tools:  Playables

Not my playable graph, but a sample one provided by Unity.

Learning is an important part of Jam Week.  Besides learning the answers to prototyping questions, I often decide to try something new.  This Jam Week in particular, I decided to work with Unity’s Playable System. One challenge with this game is that characters would need to be able to choose from a wide variety of animations; however, having all of these animations loaded at runtime would probably not be very efficient.

To remedy this, I utilized the Playable System.  Unlike Unity’s runtime animator controller, a playable graph can be built dynamically at runtime.  So, for example, if a character is equipped with a cubic right leg, I can utilize an animation, let’s call it “cubic right kick.”  If I then equip a spherical right leg, I can replace it with “spherical right kick.”  All I have to do is rebuild the playable graph and apply it.  There is still a lot of finesse needed, such as making the animations blend cleanly, but the Playable system’s ability to load animations dynamically makes it seem like a great fit.  The system also has some strict rules, such as the requirement that you MUST destroy a playable graph once you’re done with it.
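
As a minimal sketch of the idea (this is not my actual graph code, and the clip and class names are made up), rebuilding a graph for a newly equipped limb might look something like this:

using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

public class LimbAnimationPlayer : MonoBehaviour
{
    PlayableGraph graph;

    // Rebuilds the graph so it only contains the clip for the currently
    // equipped limb, e.g. "cubic right kick" or "spherical right kick".
    public void SetKickClip(Animator animator, AnimationClip kickClip)
    {
        if (graph.IsValid())
            graph.Destroy();    // graphs MUST be destroyed before being replaced

        graph = PlayableGraph.Create("LimbGraph");
        var output = AnimationPlayableOutput.Create(graph, "Animation", animator);
        var clipPlayable = AnimationClipPlayable.Create(graph, kickClip);
        output.SetSourcePlayable(clipPlayable);
        graph.Play();
    }

    void OnDestroy()
    {
        if (graph.IsValid())
            graph.Destroy();    // ...and when you're done with them
    }
}

The key point is that the graph is just data built at runtime, so swapping “cubic right kick” for “spherical right kick” is only a matter of tearing the old graph down and building a new one.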

Next Steps

Getting something playable — no pun intended — felt nice, but there is still a lot that can be done.

Getting as far as I did felt like a minor victory.

This is just a prototype, but also something I’d like to continue at a future time in some capacity.  I think the following are things I would like to answer in the future:

  • Should there be guard functionality?  If so, what does that look like?
  • Can this work with an original IP that does NOT involve swapping limbs?  
    • Would swapping “styles” like in Final Fantasy Tactics work better?
    • How many moves does a character need to make them feel “complete?”
  • Can you have multiple characters on a team?  
    • If you have multiple enemies on a team, can you change position and try to line up a “shot”?

And these are just a few of the questions.  Overall, there is a lot that would need to be done to make this a full game; however, I think Jam Week gave me a good head start on understanding the idea a lot better.  For now though, I’m most likely going to continue with MerFight and give this a break for a few weeks before returning to it with fresh eyes. I’d like to eventually release this prototype to the public to try, but I think it needs a bit more polish before that.


Unity3D Tool – HairKit

Hair.  It’s probably one of the most difficult things for me to 3D model.  Whether I’m trying to go for a more chunky, anime look or for planar hair, it’s a challenge.  To try and remedy this, I created a tool a few months ago — maybe even over a year — that I called HairKit.  This post is a brief overview of the tool and where it is now.

HairKit Components

The following diagram demonstrates how the different components of HairKit come together.

Hair Kit Main is the main component that creates the Unity3D mesh.  This is made up of a set of Hair Kit Lines which require a Hair Kit Shape and a set of Hair Kit Line Points.  Finally, there is an optional component, HairKit Smoothed Line Helper, which can create a set of Hair Kit Line Points with smooth interpolation and spacing.

HairKit Main

The HairKitMain component is pretty straightforward.  When adding it to an object, it’ll automatically add a MeshFilter and a MeshRenderer component to the GameObject.  It will also create a new mesh named <GameObject’s Name> mesh.  Before lines can be rendered, a HairKitShape and a HairKitLine need to be set up.
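
I’m not showing HairKitMain’s actual source here, but the automatic component setup is the sort of thing Unity’s RequireComponent attribute handles; a minimal sketch of that behavior might look like this:

using UnityEngine;

// Sketch only: ensures a MeshFilter and MeshRenderer exist as soon as the
// component is added, and creates the named mesh, similar to HairKitMain.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class HairKitMainSketch : MonoBehaviour
{
    void Reset()
    {
        // Reset runs in the editor when the component is first added.
        var mesh = new Mesh { name = gameObject.name + " mesh" };
        GetComponent<MeshFilter>().sharedMesh = mesh;
    }
}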

HairKitShape

HairKitShape defines the shape that will be used when making lines.  It uses the children of its GameObject to define the shape.  The Gradient is used by the gizmo system to draw the shape.  The UV Percentages, which will be used for the UV layout of the different meshes, are based on the distance between points.  You can automate this by having Automate UV Percentages checked; if you uncheck it, the UV Percentage array will still be forced to the correct length, but you can set the values as you wish.
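
The automated percentages are essentially just normalized cumulative distances between the shape’s child points.  Here’s a rough sketch of that calculation (not the tool’s exact code):

using UnityEngine;

public static class HairKitUvSketch
{
    // Sketch: UV percentages as normalized cumulative distances between the
    // shape's child points.  Not HairKit's exact implementation.
    public static float[] ComputeUvPercentages(Transform shapeRoot)
    {
        int count = shapeRoot.childCount;
        float[] percentages = new float[count];
        float total = 0f;

        for (int i = 1; i < count; i++)
        {
            total += Vector3.Distance(shapeRoot.GetChild(i - 1).localPosition,
                                      shapeRoot.GetChild(i).localPosition);
            percentages[i] = total;     // cumulative distance so far
        }

        if (total > 0f)
        {
            for (int i = 0; i < count; i++)
                percentages[i] /= total;    // normalize into the 0..1 UV range
        }

        return percentages;
    }
}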

You can rename the children of this GameObject more cleanly by pressing the Rename Children button.

You can also create an enclosed, circular shape by pressing “Create Shape.”  The resulting shape will have the number of points specified by Count, minus one, and the specified radius.  You cannot set Count lower than 3.  So, to create a line, use 3; for a triangle, 4; etc.
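
Conceptually, “Create Shape” just places Count minus one points evenly around a circle of the given radius; here’s a rough sketch (again, not the tool’s exact code):

using UnityEngine;

public static class HairKitShapeSketch
{
    // Sketch of "Create Shape": Count - 1 points evenly spaced around a circle
    // of the given radius (Count = 3 gives a line, 4 a triangle, and so on).
    public static Vector3[] CreateShape(int count, float radius)
    {
        count = Mathf.Max(3, count);    // Count cannot be set lower than 3.
        int points = count - 1;
        var result = new Vector3[points];

        for (int i = 0; i < points; i++)
        {
            float angle = (Mathf.PI * 2f * i) / points;
            result[i] = new Vector3(Mathf.Cos(angle), Mathf.Sin(angle), 0f) * radius;
        }
        return result;
    }
}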

The previous image showcases some examples of shapes created by different HairKitShape configurations.

HairKitLine

Once you have a satisfactory shape, it’s time to start creating the line.

HairKitLine is a pretty full component.  For the quickest approach, assign a HairKitShape and then press “Add Child.”  This will create a HairKitLinePoint.

Each point can then add a child or a sibling, which will be added to the line itself.

Locking a point allows the parent to be moved around without disrupting the position of the point.

The following .gif demonstrates adding a set of points:

Once you have a line you are happy with, you can add it to the Hair Kit Main to see the shape itself.

HairKitLine has the most fields to edit, but for now, this covers the basics of the HairKit system.  The HairKitLineSmoother, the optional helper mentioned earlier, can generate a set of smoothly interpolated, evenly spaced line points.

Saving the Mesh

Once you are happy with the mesh, you can save out the mesh using the context menu of the Hair Kit Main component.

You can either Clone and Save the mesh or Skin the mesh.

Clone and Save will create a new GameObject, except this GameObject will not have a HairKitMain component, and its MeshFilter will refer to a newly created mesh asset.

Skin is a bit more difficult.  The bones have to be set up in a specific way, cascading child-by-child for this to work properly.

Either way, the mesh should be saved out because the update methods used by the HairKit components are not the most efficient and should not be included in a final game.
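
For anyone curious, saving a generated mesh out as a project asset boils down to something like the following editor-only sketch (the path and method names here are illustrative, not HairKit’s actual context-menu code):

using UnityEditor;
using UnityEngine;

// Editor-only sketch: writes a runtime-generated mesh to the project as an
// asset so the HairKit components can be stripped from the final object.
public static class MeshSaveSketch
{
    public static void SaveMeshAsset(Mesh mesh, string assetPath)
    {
        // Clone so the saved asset is independent of the live, editable mesh.
        Mesh clone = Object.Instantiate(mesh);
        clone.name = mesh.name;

        AssetDatabase.CreateAsset(clone, assetPath);    // e.g. "Assets/Hair.asset"
        AssetDatabase.SaveAssets();
    }
}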

Was It Useful?

In the end, I essentially recreated 3ds Max’s loft tool; I realize now that this tool should probably be renamed LoftKitTool, but the original intent was for hair.  Recently, however, I discovered I can use it for creating trails in my current game rather quickly.  The following are some .gifs of it being used.

Anyway, I wanted to share this as I may release it one day as a Unity package or maybe even on the Unity Asset Store.  Any comments would be greatly appreciated: features you’d like to see in the tool, whether you would pay for it, how much, and so on.


ProtoFighter Dev Blog 01

So for the past couple of months I’ve been working on a new fighting game prototype.  After discovering TrueSync by Exit Games, I’ve been trying very hard to create a new fighting game with it.  Again, one of my biggest regrets with Battle High is that I was never able to implement multiplayer before its release, and I definitely feel that TrueSync could help me achieve that!  Anyway, I decided to write a little bit about the game and what I’m trying to do with it.

ProtoFighter

I chose this name because what I made was a prototype, and I wanted to make that clear.  I decided to use only assets from the Unity3D Asset Store, which TrueSync itself already is.  This includes my characters, audio, and more!  Here is a short list of some of the assets I am using:

Goals

I had several goals while making this prototype.

Learn TrueSync With a Focus on a Fighting Game

My first goal was to learn TrueSync and make a game using it.  I think I accomplished this.  In fact, ProtoFighter isn’t my first TrueSync experiment; Diamonds Not Donuts, a small game I released on itch.io for free, is!  That being said, for ProtoFighter, I wanted to focus more on fighting games and the various issues concerning them.  ProtoFighter has a lot of the gameplay functionality that most fighters do — blocking, jumping, attacking, special moves, supers, rounds, etc.  Obviously it’s missing a lot to be a complete fighting game package — single-player modes, more characters, balance (which is a MESS), etc.  Again, for a pre-pre-pre alpha, I think I achieved my goal, but of course, when it comes to TrueSync, there are still a ton of questions I have, and I hope to continue to answer them as I expand upon this prototype.

Make a Fighter That Is Slightly More Accessible Than Most

Though not TrueSync related, I’ve always wanted to try and make a fighting game that was a bit more accessible to the average player.  Maybe not as extreme as Fantasy Strike, but something that I could still explain relatively easily.

In ProtoFighter, though I sadly haven’t released a tutorial yet, I tried to do this.  Essentially, instead of performing quarter-circle motions, I simplify special moves to forward or back plus an attack.  Now, a lot of people would immediately say this oversimplification could cause issues such as instant dragon punches or anti-airs, so to solve this I did two things.  Firstly, all initial moves such as forward+punch have rather long start-up and are reserved for moves like overheads or projectiles.  Secondly, every special move has a “secondary” special that branches from it.  For example, forward+punch may begin an overhead, but if you press up before the attack activates, a secondary attack, probably an anti-air, is performed instead.  The hope is that performing the initial move and then the secondary move requires just enough time and frames that the anti-air won’t be so instantaneous.  Maybe this won’t help, but the idea is that it’s simple to actually perform an attack, while it requires some dexterity and memorization to cancel one move into another properly.
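
To make the branching idea a bit more concrete, here’s a rough sketch of the rule; the names and frame counts are hypothetical, not ProtoFighter’s actual state machine:

// Hypothetical sketch of the "secondary special" branch described above;
// ProtoFighter's real state machine is more involved than this.
public class ForwardPunchMove
{
    const int StartupFrames = 20;       // deliberately long start-up
    const int BranchWindowFrames = 12;  // frames during which "up" can branch

    int frame;
    bool branchedToAntiAir;

    public void Begin()
    {
        frame = 0;
        branchedToAntiAir = false;
    }

    // Called once per simulation frame; returns the move that becomes active
    // once start-up finishes, or null while the move is still starting up.
    public string Tick(bool upHeld)
    {
        frame++;

        // Pressing up before the attack activates redirects the move
        // into its anti-air secondary.
        if (upHeld && frame <= BranchWindowFrames)
            branchedToAntiAir = true;

        if (frame < StartupFrames)
            return null;

        return branchedToAntiAir ? "AntiAir" : "Overhead";
    }
}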

A secondary idea I then had is to still allow players to perform attacks using quarter-circle motions; however, these players would be rewarded with a slight meter bonus.  That way, you don’t have to perform the traditional motions to compete or play, but players who can are rewarded slightly for taking the time and effort to perform the more complex inputs.  I can’t really tell if this input system will be good or not until someone tests it, which is why I released the prototype.
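
A tiny sketch of how that reward might be applied (again, hypothetical names and numbers):

// Hypothetical sketch: both input styles trigger the same special move,
// but the traditional quarter-circle grants a small meter bonus.
public class SpecialMeter
{
    public float Meter { get; private set; }

    const float BaseMeterGain = 10f;
    const float MotionInputBonus = 2f;  // small reward for the harder input

    public void OnSpecialPerformed(bool usedQuarterCircle)
    {
        Meter += BaseMeterGain + (usedQuarterCircle ? MotionInputBonus : 0f);
    }
}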

Create a Framework

My third goal was to begin creating a framework so that I can create future titles, TrueSync or not, more quickly.  A lot of the games I work on are fighting-game influenced, so I wanted to construct a framework that would make creating future titles, whether 2D or 3D, easier.  Though it’s not perfect, I definitely tried to abstract more of my classes and functionality, and I believe I could go from this 2.5D fighting game to a 3D game rather quickly with few changes.

TrueSync Tips

So, for this fighting game, I learned a good amount about TrueSync.  TrueSync attempts to be deterministic: a local player’s inputs are respected immediately and passed over the network, and if inconsistencies are found when comparing game states, the game is rolled back and resimulated.

The issue, though, is that Unity3D wasn’t built to be deterministic.  Its use of floats and its random system, for example, can cause various issues.  Its animation system also isn’t deterministic, so trying to perfectly simulate results across two machines can be rather problematic.  Anyway, here are some tips I found helpful for completing my prototype.

Note, these tips were written for Unity3D version 2017.1.1f1 and TrueSync version 1.1.0B.

Don’t “Press” Inputs

TrueSync uses a unique method to capture and send input called OnSyncedInput.  Here’s an example of how it works.

public class MyTSClass : TrueSyncBehaviour
{
    public override void OnSyncedInput()
    {
        TrueSyncInput.SetBool(0, Input.GetKeyDown(KeyCode.Space));
    }
}

So in the above, TrueSyncInput is used to pass inputs over the network.  The first argument is a byte, used as a key.  I’m just using 0 for now, but if you use multiple keys, you should probably assign them to constants.  Then, I’m using Input.GetKeyDown to send a bool indicating whether space was pressed.  One issue with this method is that OnSyncedInput is executed similarly to FixedUpdate, so calls such as “Input.GetKeyDown” don’t work consistently: by the time OnSyncedInput is called, the key press is sometimes missed.  To resolve this for button inputs, here’s what I did:

public class MyTSClass : TrueSyncBehaviour
{
    bool hasPressed = false;

    public override void OnSyncedInput()
    {
        bool singlePress = false;
        if (Input.GetKey(KeyCode.Space))
        {
            if (!hasPressed)
            {
                hasPressed = true;
                singlePress = true;
            }
        }
        else if (hasPressed)
        {
            hasPressed = false;
        }

        TrueSyncInput.SetBool(0, singlePress);
    }
}

This change uses a bool, hasPressed, that is set to true during OnSyncedInput if the space bar is currently down.  The toggle is then reset once the space bar is no longer being held down.  The bool that is actually passed to TrueSyncInput.SetBool is only set if the key is down AND hasPressed was false before being set to true.  This way, the first entry of TrueSyncInput will be true for only one execution of OnSyncedInput.  This should prevent OnSyncedInput from missing an input, as the average button press usually lasts a few frames.  I don’t use this method exactly in ProtoFighter, but the idea is similar.  Instead of using separate booleans for each input type — up, down, left, right, etc. — I use an integer and bitmasking to change it during OnSyncedInput.
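
For reference, the bitmasking variant looks roughly like the following.  I’m assuming a TrueSyncInput.SetInt overload here; if that isn’t available in your version, the packed value can be sent as a byte or split into bools instead:

using TrueSync;
using UnityEngine;

public class MyTSInputClass : TrueSyncBehaviour
{
    // Bit flags for each input; combined into a single value per synced frame.
    const int UP = 1 << 0;
    const int DOWN = 1 << 1;
    const int LEFT = 1 << 2;
    const int RIGHT = 1 << 3;
    const int PUNCH = 1 << 4;

    public override void OnSyncedInput()
    {
        int mask = 0;
        if (Input.GetKey(KeyCode.W)) mask |= UP;
        if (Input.GetKey(KeyCode.S)) mask |= DOWN;
        if (Input.GetKey(KeyCode.A)) mask |= LEFT;
        if (Input.GetKey(KeyCode.D)) mask |= RIGHT;
        if (Input.GetKey(KeyCode.Space)) mask |= PUNCH;     // plus the single-press guard shown above

        // Assumes TrueSyncInput exposes a SetInt overload; if it doesn't in
        // your version, the mask can be sent as a byte or split into bools.
        TrueSyncInput.SetInt(0, mask);
    }
}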

Treat TrueSync Like A Separate Engine

This sounds silly as Unity3D is a game engine; however, to make TrueSync’s determinism work properly, you have to use a lot of the unique structs and classes it introduces.  There’s FP, or FixedPoint, for float values, for example, and TSVector for Vector3s.  Also, TrueSync has its own transform class (TSTransform) that does not have all the functionality — at least for now — that Unity3D’s Transform class has.  You can’t use children the same way, and certain methods, such as those that convert transform information from world to local space, are missing.  Overall, you can’t just take a finished game and integrate TrueSync into it quickly.

One trick I had to do, for example, was figure out a way to align character hit spheres to certain joints.  In a normal setting, I could just use the following:

Animator anim = GetComponent<Animator>();
Transform t = anim.GetBoneTransform(HumanBodyBones.Chest);
Vector3 chestPos = t.position;

However, one problem is that this creates a Vector3 and even though I can convert the position to TrueSync’s Vector3 equivalent, a TSVector, they may be different values between the multiple players due to floating point precision errors.

To resolve this, I built a tool to cycle through my animations and store important point information as a TSVector  in a ScriptableObject.  I don’t save the position though; instead, I save the vector from the center to this point.  So, to get where the chest would be in my animation, it would be something like the following:

TSVector localChestVector = GetChestPosition(frame);
TSVector worldChestVector = tsTransform.position + tsTransform.rotation * localChestVector;

So, in the above, I’ve gotten a local vector for my chest position and then used my player’s position and rotation to define the world position for my chest.  You’ll also notice that I’ve used a frame.  This is because a lot of fighting games interpret things in frames, and I believe a deterministic game in TrueSync is a lot easier to understand through the concept of frames than through time.  Even though my 3D animation is made up of curves, I store different bone information in these TSVectors so they can be referred to later regardless of the rotation or position of my character.  I also use a similar technique for moving a character by their root animation without actually having the Animator drive it.

No Animators — At Least How You Think

As of right now, TrueSync doesn’t have an Animator class.  For fighting games, this can be an issue since animations and the accuracy of said animations is so important.  To handle this, I did the following:

  • Stored all of my animator data in a separate data structure, mostly just my transition parameters and conditions
  • Muted ALL of my animation transitions
  • Disabled the Animator Component
  • Use Animator.Update(float)

So, even though the animator is disabled, Animator.Update(float) still allows the Animator to be updated.  Even though you do have to use a float instead of an FP, the amount I update is determined by the frame I’m supposed to be on, so my update function looks like this.

FP syncedFrame;
FP localFrame;
Animator anim;

private void Update()
{
    anim.Update((TrueSyncManager.DeltaTime * (syncedFrame - localFrame)).AsFloat());
    localFrame = syncedFrame;
}

So, here I have syncedFrame, which is the frame of my animation that is set during OnSyncedUpdate.  Then I subtract localFrame from syncedFrame, multiply by the delta time, and convert the result to a float value.  I then set localFrame to syncedFrame.  I used FP instead of integers in case I want to play the game in slow motion.  This still needs some tweaking, but it gets the general idea across.

Overall, using Animator.Update(float) is great because it allows me to still get a lot of the functionality of Animators:

  • Transition blending
  • IK
  • Mirroring
  • Humanoid rigs

but with more control.  This is actually one reason all transitions in the Animator are muted: I don’t want transitions to happen automatically and switch states suddenly if there is a rollback.  Doing things more manually allows me to switch states when I need to.

Just one small part of my AnimatorController; the red arrows show that my transitions are muted.

Anyway, the future of ProtoFighter is uncertain.  I will most certainly not release this as a full game, but instead a fighting game demo.  I know in this current 2D version I’d like to do the following:

  • Add rooms and lobbies instead of the “Ranked Match” system it uses now
  • Add stage select
  • Add an interactive tutorial
  • Balance and clean up the existing characters, Protolightning and Protaqua
  • Start looking into AI and single player modes

Overall, the goal with this game is to eventually get the framework to a place where I can experiment with a variety of gameplay styles and make something myself later down the road, hopefully sooner rather than later.  Maybe I can even use this to integrate TrueSync into Battle High 2 A+ — though I make zero promises.

ProtoFighter is available on itch.io & Game Jolt for free!  If you download the game and play it, I’d love to hear your feedback — but make sure you try the multiplayer as that’s the main area I’m trying to focus on.  Also, if you have any questions on TrueSync, I’d love to try and help, as I think it’s a great asset that can help bring online functionality to a lot of new indie games — fighting games and others — in the future.


Unite 2017

I recently returned from Unite 2017 in Austin, Texas, one of several conferences Unity holds annually to discuss upcoming features of the Unity3D game engine.

I usually write long posts about my experience at these conferences, but this year was a bit different.  Not a letdown per se, but I just didn’t feel I got as much out of it as I have in previous years.  I didn’t leave feeling inspired and invigorated.

First, there weren’t a ton of sessions like in previous years; in fact, on the first day of the conference, only the expo hall was open.  It was a nice expo, but it also felt lacking in some ways.  Last year Unite was held in a different convention center, so it’s possible that the larger expo floor made this one feel smaller, but regardless, having no sessions on the first day just felt odd and left myself and others asking, “What’s the point?”

Overall, none of the sessions I attended blew me away, nor did the keynote.  Most of them were great overviews.  There was a talk about different network architectures from Exit Games that I liked, as well as one that went over the character-building techniques of the Rick & Morty VR game.  There was also a decent discussion and demonstration of the AR features coming to Unity3D in the next few years.

I think that would be my next biggest complaint.  A LOT of AR and VR, almost too much.  I understand they are exciting technologies, but I wish, like last year, there were a few more talks about design itself or just more variety in general.  I always feel that no matter how good your engine or tools are, if your games aren’t designed well, it won’t matter.  Maybe there were, and I just missed them.

Overall, I think it was worth the price of admission but am definitely on the fence if I’ll go next year or go to a different conference such as GDC instead.


Unity3D Script: Quick Texture Editor

Last year I wrote a Unity3D editor script for combining textures as well as swapping and combining their different color channels.


Someone on YouTube recently commented, asking for more details. Since I haven’t touched the script in over a year, I decided to just make the script public. It’s not perfect and some of my comments don’t make sense. I’ll probably clean it up in the future, or at least add better documentation.  I sound very professional right now.


What this script does:

  • Allows you to swap color channels
    • For example, take the red channel of a grayscale smoothness map and apply it to the alpha channel of your albedo texture
  • Allows you to combine textures onto a new, larger texture
    • For example, you have two 512×512 textures and want to combine them onto one 1024×512 texture

What this script does NOT do:

  • Resize textures
  • Rearrange meshes’ UVs
  • Paint onto textures
  • Create textures other than PNGs
using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using UnityEditor;
using System.IO;
 
namespace MattrifiedGames.Assets.TextureHelpers.Editor
{
    /// <summary>
    /// Editor window for quickly swapping, rearranging, and other things to textures in Unity3D.
    /// </summary>
    public class QuickTextureEditor : EditorWindow
    {
        /// <summary>
        /// A list of the current affected textures.
        /// </summary>
        List<TextureInformation> texturePositionList;
 
        /// <summary>
        /// If true, the new texture's size will be forced to the nearest power of two.
        /// </summary>
        bool forcePowerOfTwo = false;
 
        /// <summary>
        /// Width of the new texture.
        /// </summary>
        int newTexWidth = 512;
 
        /// <summary>
        /// Height of the new texture.
        /// </summary>
        int newTexHeight = 512;
 
        /// <summary>
        /// The name of the new texture to be created.
        /// </summary>
        string newTextureName = "New Texture";
 
        /// <summary>
        /// Operations affecting different channels.
        /// </summary>
        public enum ChannelOperations
        {
            Ignore = 0,
            Set = 1,
            Add = 2,
            Subtract = 3,
            Multiply = 4,
            Divide = 5,
        }
 
        public struct ChannelBlendSetup
        {
            public ChannelOperations rCU, gCU, bCU, aCU;
        }
 
        /// <summary>
        /// Information about each texture being used to create the new texture.
        /// </summary>
        internal class TextureInformation
        {
            /// <summary>
            /// The texture being used.
            /// </summary>
            public Texture2D texture;
 
            /// <summary>
            /// The x and y position of the new texture.
            /// </summary>
            public int xPos, yPos;
 
            /// <summary>
            /// The x and y position of the new texture.
            /// </summary>
            public int width, height;
 
            /// <summary>
            /// Should a multiply color be used?
            /// </summary>
            public ChannelOperations blendColorUse = ChannelOperations.Ignore;
             
            /// <summary>
            /// The color to be blended with the texture.
            /// </summary>
            public Color blendColor;
 
            public ChannelBlendSetup rBS = new ChannelBlendSetup() { rCU = ChannelOperations.Set },
                gBS = new ChannelBlendSetup() { gCU = ChannelOperations.Set },
                bBS = new ChannelBlendSetup() { bCU = ChannelOperations.Set },
                aBS = new ChannelBlendSetup() { aCU = ChannelOperations.Set };
 
            public void OnGUI(string label, ref int refWidth, ref int refHeight)
            {
                if (texture != null)
                    label = texture.name;
                texture = (Texture2D)EditorGUILayout.ObjectField(label, texture, typeof(Texture2D), false);
 
                if (GUILayout.Button("Set as new texture size."))
                {
                    refWidth = width;
                    refHeight = height;
                }
 
                if (texture == null)
                {
                    Vector2 s = new Vector2(width, height);
                    s = EditorGUILayout.Vector2Field("Size", s);
                    width = Mathf.Max(1, Mathf.RoundToInt(s.x));
                    height = Mathf.Max(1, Mathf.RoundToInt(s.y));
                }
                else
                {
                    width = texture.width;
                    height = texture.height;
                }
 
                blendColorUse = (ChannelOperations)EditorGUILayout.EnumPopup("Blend Color Usage", blendColorUse);
                if (blendColorUse != ChannelOperations.Ignore)
                    blendColor = EditorGUILayout.ColorField(blendColor);
                else
                    blendColor = Color.white;
 
                Vector2 v = new Vector2(xPos, yPos);
                v = EditorGUILayout.Vector2Field("Pos", v);
                xPos = Mathf.RoundToInt(v.x);
                yPos = Mathf.RoundToInt(v.y);
 
                EditorGUILayout.BeginHorizontal();
 
                EditorGUILayout.BeginVertical();
                GUILayout.Label("");
                GUI.color = Color.red;
                GUILayout.Label("R");
 
                GUI.color = Color.green;
                GUILayout.Label("G");
 
                GUI.color = Color.blue;
                GUILayout.Label("B");
 
                GUI.color = Color.white;
                GUILayout.Label("A");
                EditorGUILayout.EndVertical();
 
                ChangeBlendSetup("R", ref rBS, Color.red);
                ChangeBlendSetup("G", ref gBS, Color.green);
                ChangeBlendSetup("B", ref bBS, Color.blue);
                ChangeBlendSetup("A", ref aBS, Color.white);
 
                EditorGUILayout.EndHorizontal();
            }
 
            private void ChangeBlendSetup(string p, ref ChannelBlendSetup bS, Color guiColor)
            {
                EditorGUILayout.BeginVertical();
                GUI.color = guiColor;
                GUILayout.Label(p);
                GUI.color = Color.white;
                 
                bS.rCU = (ChannelOperations)EditorGUILayout.EnumPopup(bS.rCU);
                bS.gCU = (ChannelOperations)EditorGUILayout.EnumPopup(bS.gCU);
                bS.bCU = (ChannelOperations)EditorGUILayout.EnumPopup(bS.bCU);
                bS.aCU = (ChannelOperations)EditorGUILayout.EnumPopup(bS.aCU);
                 
                EditorGUILayout.EndVertical();
            }
 
            internal void EditColor(ref Color colorOutput, ref Color colorInput)
            {
                EditChannel(ref colorOutput.r, ref colorInput, rBS);
                EditChannel(ref colorOutput.g, ref colorInput, gBS);
                EditChannel(ref colorOutput.b, ref colorInput, bBS);
                EditChannel(ref colorOutput.a, ref colorInput, aBS);
            }
 
            private void EditChannel(ref float outputValue, ref Color inputColor, ChannelBlendSetup bs)
            {
                EditChannel(ref outputValue, ref inputColor.r, bs.rCU);
                EditChannel(ref outputValue, ref inputColor.g, bs.gCU);
                EditChannel(ref outputValue, ref inputColor.b, bs.bCU);
                EditChannel(ref outputValue, ref inputColor.a, bs.aCU);
            }
 
            private void EditChannel(ref float output, ref float input, ChannelOperations channelUsage)
            {
                switch (channelUsage)
                {
                    case ChannelOperations.Set:
                        output = input;
                        break;
                    case ChannelOperations.Add:
                        output += input;
                        break;
                    case ChannelOperations.Divide:
                        output /= input;
                        break;
                    case ChannelOperations.Multiply:
                        output *= input;
                        break;
                    case ChannelOperations.Subtract:
                        output -= input;
                        break;
                    case ChannelOperations.Ignore:
                        return;
                }
            }
        }
 
         
 
        // Add menu named "My Window" to the Window menu
        [MenuItem("Tools/Quick Texture Editor")]
        static void Init()
        {
            // Get existing open window or if none, make a new one:
            QuickTextureEditor window = (QuickTextureEditor)EditorWindow.GetWindow(typeof(QuickTextureEditor));
            window.Show();
        }
 
        /// <summary>
        /// On GUI function that displays information in the editor.
        /// </summary>
        void OnGUI()
        {
            OnGUICombineTextures();
        }
 
        /// <summary>
        /// Quickly gets the importer of a specified asset
        /// </summary>
        /// <typeparam name="T">The type of importer to be used.</typeparam>
        /// <param name="asset">The asset whose importer is being referenced.</param>
        /// <returns>The importer, converted to the requested type.</returns>
        private T GetImporter<T>(UnityEngine.Object asset) where T : AssetImporter
        {
            return (T)AssetImporter.GetAtPath(AssetDatabase.GetAssetPath(asset));
        }
 
        private void SetupList<T>(ref List<T> list, int p)
        {
            if (list == null)
                list = new List<T>();
            while (list.Count <= p)
                list.Add(default(T));
        }
 
        private T GetFromList<T>(ref List<T> list, int p)
        {
            SetupList(ref list, p);
            return list[p];
        }
 
        private void DefineTexturePose(int index)
        {
            SetupList(ref texturePositionList, index);
            if (texturePositionList[index] == null)
                texturePositionList[index] = new TextureInformation();
 
            texturePositionList[index].OnGUI("Texture " + index, ref newTexWidth, ref newTexHeight);
        }
 
        private static Color DivideColor(Color c)
        {
            return new Color(1f / c.r, 1f / c.g, 1f / c.b, 1f / c.a);
        }
 
        Vector2 scroll;
        private void OnGUICombineTextures()
        {
            // Defines information about the new texture.
            newTextureName = EditorGUILayout.TextField("New Texture Name", newTextureName);
 
            forcePowerOfTwo = EditorGUILayout.Toggle("Force Power of 2", forcePowerOfTwo);
            if (forcePowerOfTwo)
            {
                newTexWidth = Mathf.ClosestPowerOfTwo(EditorGUILayout.IntField("Width", newTexWidth));
                newTexHeight = Mathf.ClosestPowerOfTwo(EditorGUILayout.IntField("Height", newTexHeight));
            }
            else
            {
                newTexWidth = EditorGUILayout.IntField("Width", newTexWidth);
                newTexHeight = EditorGUILayout.IntField("Height", newTexHeight);
            }
 
            EditorGUILayout.Separator();
 
            scroll = EditorGUILayout.BeginScrollView(scroll);
            if (texturePositionList == null)
                texturePositionList = new List<TextureInformation>();
            for (int i = 0; i < texturePositionList.Count; i++)
            {
                DefineTexturePose(i);
            }
 
 
            EditorGUILayout.BeginHorizontal();
            if (GUILayout.Button("Add Texture"))
            {
                texturePositionList.Add(new TextureInformation());
                return;
            }
            if (GUILayout.Button("Remove Texture"))
            {
                texturePositionList.RemoveAt(texturePositionList.Count - 1);
                return;
            }
            EditorGUILayout.EndHorizontal();
 
            EditorGUILayout.EndScrollView();
 
            EditorGUILayout.Separator();
 
            if (GUILayout.Button("Save Texture"))
            {
                int textureCount = texturePositionList.Count;
 
                Texture2D newTex = new Texture2D(newTexWidth, newTexHeight);
                newTex.name = string.IsNullOrEmpty(newTextureName) ? "New Texture" : newTextureName; 
                Color[] mainColors = new Color[newTex.width * newTex.height];
                newTex.SetPixels(mainColors);
 
                List<TextureInformation> pulledTextures = new List<TextureInformation>();
                for (int i = 0; i < textureCount; i++)
                {
                    TextureInformation pos = GetFromList(ref texturePositionList, i);
                    if (pos == null)
                        continue;
                    else if (pos.texture == null)
                    {
                        pos.texture = new Texture2D(pos.width, pos.height);
                        pos.texture.name = "Texture " + i;
                        Color[] c = new Color[pos.width * pos.height];
                        for (int j = 0; j < c.Length; j++)
                            c[j] = pos.blendColor;
                        pos.texture.SetPixels(c);
                        pos.texture.Apply();
                    }

                    if (pos.texture.width + pos.xPos > newTex.width ||
                        pos.texture.height + pos.yPos > newTex.height)
                    {
                        Debug.LogWarning(pos.texture.name + " will not fit into new texture.  Skipping.");
                        continue;
                    }
 
                    pulledTextures.Add(pos);
                }
 
                for (int i = 0; i < pulledTextures.Count; i++)
                {
                    EditorUtility.DisplayProgressBar("Saving Texture", "Working on Texture " + i, (i + 1f) / pulledTextures.Count);
 
                    TextureImporter ti = GetImporter<TextureImporter>(pulledTextures[i].texture);
                    bool wasReadable = ti.isReadable;
                    bool wasNormal = ti.normalmap;
 
                    if (wasReadable != true)
                    {
                        ti.isReadable = true;
                        ti.SaveAndReimport();
                    }
 
                    if (wasNormal)
                    {
                        ti.normalmap = false;
                        ti.SaveAndReimport();
                    }
 
 
                    Color[] pulledColors = pulledTextures[i].texture.GetPixels();
 
                    if (pulledTextures[i].blendColorUse != ChannelOperations.Ignore)
                    {
                        for (int c = 0; c < pulledColors.Length; c++)
                        {
                            switch (pulledTextures[i].blendColorUse)
                            {
                                case ChannelOperations.Set:
                                    pulledColors[c] = pulledTextures[i].blendColor;
                                    break;
                                case ChannelOperations.Add:
                                    pulledColors[c] += pulledTextures[i].blendColor;
                                    break;
                                case ChannelOperations.Divide:
                                    pulledColors[c] *= DivideColor(pulledTextures[i].blendColor);
                                    break;
                                case ChannelOperations.Multiply:
                                    pulledColors[c] *= pulledTextures[i].blendColor;
                                    break;
                            }
                        }
                    }
 
                    Color[] colorsToModify =
                        newTex.GetPixels(pulledTextures[i].xPos, pulledTextures[i].yPos, pulledTextures[i].texture.width, pulledTextures[i].texture.height);
                     
                    // Edits these colors per channel instead of just setting them.  Slower, but allows channels to be swapped or combined.
                    for (int c = 0; c < colorsToModify.Length; c++)
                        pulledTextures[i].EditColor(ref colorsToModify[c], ref pulledColors[c]);
 
                    newTex.SetPixels(pulledTextures[i].xPos, pulledTextures[i].yPos, pulledTextures[i].texture.width, pulledTextures[i].texture.height,
                        colorsToModify);
 
                    if (ti.isReadable != wasReadable)
                    {
                        ti.isReadable = wasReadable;
                        ti.SaveAndReimport();
                    }
 
                    if (wasNormal)
                    {
                        ti.normalmap = true;
                        ti.SaveAndReimport();
                    }
                }
 
                SaveTexture(newTex);
 
                EditorUtility.ClearProgressBar();
            }
        }
 
        void SaveTexture(Texture2D texture2D)
        {
            byte[] bytes = texture2D.EncodeToPNG();
 
            File.WriteAllBytes(Application.dataPath + "/" + texture2D.name + ".png", bytes);
 
            AssetDatabase.Refresh();
        }
    }
}

If you use the script, credit would be nice. If you have any questions, feel free to ask here or on my twitter.