Unleash the Power of Mesh and MetaHumans in Unreal Engine
Table of Contents:
- Introduction
- Installing the MetaHuman Plugin
- Importing and Setting Up the Mesh
- Creating a MetaHuman Identity Asset
- Setting Up the Neutral Pose
- Promoting the Frame for Tracking
- Adjusting the Frame
- Tracking the Asset
- Submitting the Mesh for Rigging
- Finalizing the MetaHuman
- Optional Step: Using MetaHuman Creator
Introduction
MetaHuman for Unreal is an experimental plugin that allows you to convert your character meshes into rigged MetaHumans. In this quick start guide, we will walk you through the process of using the MetaHuman plugin and creating your own MetaHuman assets. From installing the plugin to submitting your mesh for rigging, we will cover each step in detail. So let's get started!
Installing the MetaHuman Plugin
Before you can start using MetaHuman for Unreal, you need to make sure that the plugin is installed and enabled in your project. We will guide you through the installation process and provide some important information about the plugin and its assets. Keep in mind that the assets used by the plugin may change between releases, but the MetaHumans it generates are tried and tested rigged assets.
Importing and Setting Up the Mesh
The first thing you need to do is import the mesh you want to convert to a MetaHuman. Whether it's a single mesh or multiple meshes, we will show you how to import them correctly. We will also cover the importance of including the eyes in the import process. You can import FBX or OBJ files, and we will provide some recommendations for optimizing the import process for dense meshes.
Creating a MetaHuman Identity Asset
Next, we will guide you through the process of creating a MetaHuman identity asset. This asset is essential for submitting your mesh for rigging. You will learn how to access the MetaHuman asset submenu and navigate through its different sections. We will focus on the parts tree, guided workflow toolbar, promotion timeline, and viewport. Understanding these components will make it easier for you to navigate the MetaHuman workflow.
Setting Up the Neutral Pose
To ensure accurate tracking and rigging of your MetaHuman, it's crucial to set up the neutral pose correctly. We will provide detailed instructions on how to achieve a good neutral pose for your mesh. This includes ensuring unobstructed facial features, a closed mouth, and a relaxed facial expression. Paying attention to these details will significantly improve the quality of your MetaHuman.
Promoting the Frame for Tracking
In this step, you will learn how to promote the frame for tracking. We will explain what it means to promote a frame and provide guidelines for capturing a good tracking frame. This involves using a long lens, starting from a frontal view, and ensuring good symmetry. We will also cover the importance of selecting the Neutral Pose component and making adjustments to framing.
Adjusting the Frame
After promoting the frame, you will have the opportunity to make adjustments to the framing. We will explain how to navigate the camera view and ensure a clear and well-framed image for tracking. You can also compare different frame buffers to assess the quality of the tracking. We will provide instructions on how to toggle between frame buffers and utilize their contents effectively.
Tracking the Asset
Once you have captured a good frame and adjusted it to your liking, we will guide you through the tracking process. Tracking is a 2D process that relies on bright and even lighting. We will explain how to navigate the tracking interface and provide tips for achieving accurate tracking results. You will also learn how to lock your tracked frame to prevent accidental manipulation.
Submitting the Mesh for Rigging
After successfully tracking the asset, it's time to submit your mesh for rigging. We will cover the necessary steps to finalize your submission and send it to the backend for processing. You can expect a notification within a few seconds to a few minutes, depending on your internet connection. Once the mesh is processed, you will be able to inspect and further tweak your MetaHuman in MetaHuman Creator.
Finalizing the MetaHuman
With your MetaHuman now available for inspection and customization in MetaHuman Creator, we will provide some additional tips for finalizing your MetaHuman. We will discuss the importance of the Mesh to MetaHuman button and introduce some features specific to the Mesh to MetaHuman workflow. These features allow you to fine-tune the appearance of your MetaHuman by blending differences and adjusting influence.
Optional Step: Using MetaHuman Creator
In the final section of this guide, we will briefly introduce MetaHuman Creator and its role in the MetaHuman workflow. We will highlight some recent improvements and updates to the software, specifically related to the Mesh to MetaHuman workflow. You will learn how to make use of additive offset, adjust influence, and maintain control over the likeness and uniqueness of your MetaHuman.
(Article)
Introduction
MetaHuman for Unreal is an experimental plugin designed to transform your character meshes into fully rigged MetaHumans. This quick start guide will take you through every step of the process, from installing the plugin to finalizing your MetaHuman asset. So, let's jump right in!
Installing the MetaHuman Plugin
Before you can begin using MetaHuman for Unreal, ensure that the MetaHuman plugin is installed and activated within your project. Keep in mind that this plugin is experimental, meaning that the assets it employs may vary between releases. However, rest assured that the MetaHumans it generates have undergone rigorous testing and come with tried-and-true rigged assets.
Importing and Setting Up the Mesh
To get started, import the mesh you wish to convert into a MetaHuman. If your model comprises multiple meshes, tick the "Combine Meshes" option on import, and make sure the eyes are included, as they play a vital role in the conversion process. Both static and skeletal meshes are compatible, with FBX and OBJ being the accepted file formats. For dense meshes exceeding 200,000 vertices, OBJ files generally import more quickly, but you have the flexibility to choose whichever format suits your pipeline best.
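The format recommendation above boils down to a simple rule of thumb. The helper below is a hypothetical sketch (the function name and the skeletal-mesh handling are our own, not part of the plugin); it only encodes the guideline that static meshes above roughly 200,000 vertices tend to import faster as OBJ, while skeletal meshes need FBX because the OBJ format carries no skinning data.

```python
# Hypothetical helper encoding the import-format guideline above.
# The 200,000-vertex threshold comes from the guide; nothing here
# calls the Unreal Engine API.

DENSE_MESH_THRESHOLD = 200_000  # vertices

def suggest_import_format(vertex_count: int, is_skeletal: bool = False) -> str:
    """Suggest a file format for a Mesh to MetaHuman import.

    Skeletal meshes require FBX (OBJ stores no skeleton or skin weights);
    for static meshes, prefer OBJ once the mesh is dense enough that FBX
    import times become a bottleneck.
    """
    if is_skeletal:
        return "FBX"  # OBJ cannot represent skeletal data
    return "OBJ" if vertex_count > DENSE_MESH_THRESHOLD else "FBX"
```

For example, `suggest_import_format(1_500_000)` suggests OBJ for a dense scan, while a 50,000-vertex static mesh stays with FBX.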
Creating a MetaHuman Identity Asset
Once your mesh is in place, it's time to create a MetaHuman identity asset. This asset serves as an encapsulation of the workflow required to rig your mesh. You will find it in the MetaHuman asset submenu; note that the Capture Data asset in the same submenu can be ignored for now. Familiarize yourself with the sections of the asset's GUI, paying particular attention to the parts tree, guided workflow toolbar, promotion timeline, and viewport. These components serve crucial functions at different stages of the workflow.
Setting Up the Neutral Pose
To ensure optimal tracking and rigging, it is imperative to establish a precise neutral pose. During this step, make sure all facial features are unobstructed, moving aside any hair or accessories that cover them. The eyes should remain open without excessive widening, and the mouth should stay closed, with no teeth showing. By adhering to these guidelines, your mesh will have a neutral pose primed for the subsequent steps.
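The guidelines above can be gathered into a simple pre-flight checklist. The structure and field names below are hypothetical (the plugin exposes no such API); the sketch merely restates this section's neutral-pose criteria so they can be reviewed systematically before promoting a frame.

```python
from dataclasses import dataclass

@dataclass
class NeutralPoseCheck:
    """Hypothetical checklist mirroring the neutral-pose guidelines above."""
    features_unobstructed: bool  # no hair or accessories covering the face
    eyes_open: bool              # eyes open...
    eyes_widened: bool           # ...but not excessively widened
    mouth_closed: bool           # mouth shut, no teeth showing
    expression_relaxed: bool     # no residual expression on the mesh

    def problems(self) -> list[str]:
        """Return a human-readable list of guideline violations."""
        issues = []
        if not self.features_unobstructed:
            issues.append("facial features are obstructed")
        if not self.eyes_open or self.eyes_widened:
            issues.append("eyes should be open but not widened")
        if not self.mouth_closed:
            issues.append("mouth should be closed with no teeth showing")
        if not self.expression_relaxed:
            issues.append("expression should be relaxed")
        return issues
```

A mesh that passes every check returns an empty list and is ready for frame promotion.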
Promoting the Frame for Tracking
The next phase entails promoting the frame for tracking. This process involves capturing a screenshot of your viewport while simultaneously tracking specific facial features. It is vital to select the Neutral Pose component and adhere to specific guidelines: employ a long lens, favor a frontal view with symmetrical features, and reveal the inner side of the eyelids and lips uniformly. Additionally, prioritize achieving optimal frame occupancy, with the face taking up the majority of the frame.
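To make the "long lens" advice concrete: a longer focal length gives a narrower field of view, which flattens perspective across the face and keeps features looking symmetrical. The snippet below is just the standard pinhole-camera relation; the 36 mm sensor width and the example focal lengths are illustrative assumptions, not values mandated by the plugin.

```python
import math

def horizontal_fov_degrees(focal_length_mm: float,
                           sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view of a pinhole camera: fov = 2 * atan(w / 2f)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A portrait-length lens (e.g. 85 mm, ~24 degrees) sees far less of the scene
# than a wide lens (e.g. 35 mm, ~55 degrees), reducing perspective distortion.
```

With these assumed values, an 85 mm lens covers roughly 24 degrees horizontally versus about 55 degrees at 35 mm, which is why a longer lens makes it easier to fill the frame with the face without distorting it.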
Adjusting the Frame
After promoting the frame, you have the opportunity to make adjustments as needed. Ensure your framing appears precisely as desired, taking into account factors such as symmetry and clear visibility of facial features. By navigating the camera view, you can fine-tune your framing with ease. Should you need to compare different frame buffers, take advantage of the toggle feature to assess the quality and overlap of meshes within each frame buffer.
Tracking the Asset
After capturing and adjusting the frame, it's time to initiate the tracking process. Tracking is a predominantly 2D task that thrives under bright and even lighting conditions. Familiarize yourself with the tracking interface and follow our expert tips for accurate tracking. Once you are satisfied with the results, lock the frame to prevent any inadvertent changes.
Submitting the Mesh for Rigging
Once you have successfully tracked the asset, it is time to submit your mesh for rigging. Finalize your submission, and the backend will take over, processing your mesh within seconds to minutes, depending on your internet connection. Once the processing is complete, your MetaHuman will be available for inspection and further customization in MetaHuman Creator, providing an avenue for fine-tuning your creation.
Finalizing the MetaHuman
With your MetaHuman now accessible in MetaHuman Creator, we offer a few additional tips for perfecting your creation. Gain insights into the Mesh to MetaHuman button, which enables you to adjust the appearance of your MetaHuman by blending desired differences and modifying influence. By expertly utilizing these features, you retain control over the uniqueness and resemblance of your MetaHuman.
Optional Step: Using MetaHuman Creator
In the final step of this guide, we introduce MetaHuman Creator as a powerful companion to the MetaHuman workflow. Specifically, we explore recent enhancements and updates pertaining to the Mesh to MetaHuman process. Notably, we delve into the additive offset functionality and the ability to adjust influence, facilitating customization and ensuring your MetaHuman stands out. By balancing the likeness and originality of your creation, you cultivate a MetaHuman that aligns seamlessly with your vision.
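Conceptually, "adjusting influence" is a blend between a base shape and the differences captured from your mesh. The one-liner below is a conceptual sketch of such a linear blend under our own naming; MetaHuman Creator's actual implementation is not public, so this only illustrates the idea of dialing a captured difference up or down.

```python
def blend_value(base: float, target: float, influence: float) -> float:
    """Linearly blend a base value toward a target.

    influence is clamped to [0, 1]: 0 keeps the base shape unchanged,
    1 applies the captured difference in full.
    """
    influence = max(0.0, min(1.0, influence))
    return base + influence * (target - base)
```

At influence 0.5, a vertex position halfway between the base and the captured target is produced, which matches the intuition of "blending differences" described above.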
(Highlights)
- MetaHuman for Unreal is an experimental plugin for converting character meshes into rigged MetaHumans.
- Installing the plugin and enabling it within your project is the first step in the process.
- Importing and setting up the mesh, including the eyes, is crucial for successful conversion.
- Creating a MetaHuman identity asset encapsulates the workflow for rigging your mesh.
- Setting up the neutral pose and promoting the frame for tracking are key steps in the process.
- Adjustments to the frame and accurate tracking are essential for high-quality results.
- Submitting the mesh for rigging and finalizing the MetaHuman asset complete the process.
- Optional step: Using MetaHuman Creator for further customization and fine-tuning.
FAQ:
Q: Is the MetaHuman plugin compatible with all Unreal Engine projects?
A: Yes, as long as the plugin is installed and enabled, it can be used in any Unreal Engine project.
Q: Can I import multiple meshes to create a MetaHuman?
A: Yes, you can import multiple meshes, but make sure to tick the "Combine Meshes" option and include the eyes for optimal results.
Q: Are there any limitations on the file formats for importing meshes?
A: You can import FBX or OBJ files, depending on your preferences and the complexity of the mesh.
Q: How long does it take for the backend to process the submitted mesh?
A: Processing times may vary depending on your internet connection, but you can expect to receive a notification within a few seconds to a few minutes.
Q: Can I further customize my MetaHuman after it has been processed?
A: Yes, you can inspect and tweak your MetaHuman in MetaHuman Creator to achieve the desired results.
Q: Is the MetaHuman identity asset essential for rigging the mesh?
A: Yes, the MetaHuman identity asset serves as the foundation for the rigging process and must be set up correctly.
Q: Can I adjust the likeness and uniqueness of my MetaHuman?
A: Yes, by using features like additive offset and influence adjustment in MetaHuman Creator, you have control over the appearance of your MetaHuman.
Q: Are there any recommended lighting conditions for accurate tracking?
A: Bright and even lighting conditions are ideal for the tracking process, especially when using an albedo texture in unlit mode.
Q: Can I compare different frame buffers to assess the quality and overlap of meshes?
A: Yes, toggling between frame buffers allows you to compare and evaluate meshes within each frame, ensuring the best possible results.