Last Updated: 2025-04-22
As the original author and primary developer of the Latios Framework for Unity’s ECS, I regularly run into bugs, inadequate functionalities, and pitfalls within the ECS ecosystem. This has resulted in lots of “hacks” in the framework to patch up problems. This is a living document describing the issues and hacks. My hope is that the Unity ECS team will find this to be a valuable reference to help improve the quality of their packages.
This document is organized into various categories based on severity of concern as well as how easy I believe each is to fix.
This document does not include all possible features the Latios Framework may eventually implement if no official solution is provided. If you would like to learn about such features, it is best to ask me. I’ve focused this list specifically on things I would like to see. There are a lot more things I could list that I know many users have asked for, and that drive users to the Latios Framework due to these missing features or other issues. If you would like a list of these things in a specific area, or if you have any questions about anything listed here, feel free to reach out to me!
These are high-severity items that are preventing the Latios Framework from functioning correctly, with no plausible workarounds.
At the time of writing there are no real blockers for Entities 1.3.14. There is a blocker for updating to 1.4.0-exp.2, but it is an established known issue.
These are items which are entirely new features not related to any current problems, but solely based on a desire to do new things.
Companion Game Object lights are bad for any dynamic entities that are spawned and despawned. It would be awesome if we could feed unmanaged light data into the SRP, since that’s how the SRP consumes it internally.
Projected decals are currently companion Game Objects which use manual instancing. But mesh decals can use DOTS-Instancing just fine, and are compatible with the decal render features/passes. Can decal projection material properties support DOTS instancing so that we can have pure entity decals?
With the new environment system being killed off, please invest in some proof-of-concepts for baking environments authored with currently available tools into entities. I already have too much on my plate to take this on too.
Cinemachine does not play nice with the job system, or ECS physics. I’d like to see better integration.
These are items which are preventing or creating unnecessary friction for specific new features or optimizations of the Latios Framework from being developed. They may also be creating undesirable effects on usage of the framework and reducing quality on the overall solution.
Because the story around collections within entities is really bad (largely a C# issue), some of us try to pack custom collections within dynamic buffers. However, this currently breaks if you try to serialize remappable types such as blobs or UnityObjectRef. Those can usually be fixed with a baking system and a post-process system when the subscene loads. However, entity remapping on instantiation is far more difficult to pin down without extreme dictatorial rules on when and how entities can be spawned. Allowing a custom remapping function to be specified would solve this, at least for entities. Similarly, even with asmref, it is still not possible to override an entire dynamic buffer in the inspector.
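To illustrate the serialization hazard described above, here is a hedged sketch (type names are hypothetical) contrasting an element whose remappable field is visible to Unity’s serializer with a packed byte arena where it is not:

```csharp
using Unity.Entities;

// An ordinary element: the serializer sees the BlobAssetReference field
// and remaps it during subscene serialization and prefab instantiation.
public struct VisibleBlobElement : IBufferElementData
{
    public BlobAssetReference<int> blob;
}

// A packed byte arena used to emulate a custom collection. Any blob,
// UnityObjectRef, or Entity bit-copied into these raw bytes is invisible
// to remapping, which is exactly what breaks.
[InternalBufferCapacity(0)]
public struct PackedCollectionByte : IBufferElementData
{
    public byte value;
}
```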
If a zero-sized component or chunk component is added or removed on all entities in a chunk, the chunk’s archetype is converted in-place, which is a great optimization. However, when there are only a couple of entities in the chunk because the source archetype represents a temporary state, this in-place conversion will leave lots of chunks with only a small number of entities each, causing fragmentation. It would be awesome if, as an additional check, the entities were moved to another existing chunk whenever one can accommodate the entire chunk’s contents, rather than performing the in-place conversion.
Most of the time, I find myself using sub-optimal structural change sequences just to avoid this edge case. It is a constant minefield, because chunk components don’t work in live baking, and I also need to rely on cleanup components a lot since as a package provider, I don’t want to impose a destroy pipeline.
It is very frustrating not being able to extend systems and jobs with our own source generation. Often, I really want the autogenerated structure of IJobEntity and its query, but I want to leverage it in a custom algorithm that might schedule it multiple times or wrap it and invoke the IJobChunk.Execute() method with cached chunks. But this isn’t possible, because setting up the job is tied to the schedule method and is not something we can invoke ourselves. Similarly, the method rewriter in systems often gets in my way, threatening to clobber anything I do by replacing my source-generated code with its full implementation. And IJobEntity will fail to compile if anything inside it references a source-generated type in the same assembly, even if such a type is not in an attribute or Execute parameter (which should be all IJobEntity ever cares about). Also, I can’t even access the query generated for IJobEntity to calculate the entity or chunk count.
It is often optimal to design systems so that assumptions can be made about intertwined components. For example, an IEnableableComponent might be enabled whenever a DynamicBuffer’s values change. However, incremental baking won’t understand how to patch this data properly. It won’t know that the disabled component in the editor world needs to be re-enabled, even though it was initially baked enabled, when the authoring changes the buffer’s contents. Defending against this issue without hiding actual runtime bugs or bringing the editor to a crawl is a nightmare. I’d really appreciate some list of everything live baking changed so that I can run dedicated systems that patch up invalidated assumptions.
Entities 1.2 broke determinism in a really bad way. Entity IDs are no longer deterministic, meaning the only way to order a list of entities deterministically is to know both which chunks they belong to AND the indices of those chunks relative to some EntityQuery. There’s a potential workaround by keeping track of each chunk’s sequence ID, but this converts what used to be a single cache read per entity at any time into three pointer chases that are not allowed in parallel with structural changes.
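The chunk-relative ordering described above can be sketched roughly as follows (a hedged sketch assuming the Entities 1.x IJobChunk API and EntityQuery.CalculateBaseEntityIndexArray; the keys are only stable until the next structural change):

```csharp
using Unity.Burst;
using Unity.Burst.Intrinsics;
using Unity.Collections;
using Unity.Entities;

// Builds a deterministic 64-bit sort key per entity from the chunk's index
// within the query and the entity's index within the chunk.
[BurstCompile]
struct BuildSortKeysJob : IJobChunk
{
    // One entry per chunk, from EntityQuery.CalculateBaseEntityIndexArray().
    [ReadOnly] public NativeArray<int> baseEntityIndices;
    [NativeDisableParallelForRestriction] public NativeArray<ulong> sortKeys;

    public void Execute(in ArchetypeChunk chunk, int unfilteredChunkIndex,
                        bool useEnabledMask, in v128 chunkEnabledMask)
    {
        int baseIndex = baseEntityIndices[unfilteredChunkIndex];
        for (int i = 0; i < chunk.Count; i++)
            sortKeys[baseIndex + i] = ((ulong)unfilteredChunkIndex << 32) | (uint)i;
    }
}
```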
Besides creating a bunch of confusion around whether chunk order determinism even matters (because now it is really hard to preserve) and what the point of the sortKey in ECB is, this change introduces a major curveball into the Latios Framework’s debugging process.
Kinemation relies heavily on chunk components and caching of relationships. When a bug happens, it is crucial to be able to replay the simulation up to the bug to identify the source of the problem. Without determinism, whether or not two entities lie in the same chunk may change run to run. Thus, if the bug was dependent on two entities being in the same or different chunks, the bug is only reproducible by chance. That’s really, really bad.
Full determinism per architecture was one of Unity’s major competitive advantages over other solutions. And now it is being thrown away. I’m willing to compromise if it makes Game Object/ECS Unification amazing. But you’ll have to forgive me if I am a bit skeptical. I believe there may be other ways to solve the problem. What I ask is that the rules be well-defined regarding the expectations of determinism for maximum correctness that package developers should adhere to. And I ask that based on the rule defined, additional runtime and debugging tools be invested in to support the new rule and accommodate its shortcomings when it comes to chunk-level operations.
It would be awesome if I could see an outline of all functions a Burst job compiled and jump between them. Right now it is still difficult to understand what is happening in critical sections of code in massive jobs.
I want to have a job that takes a variable number of DynamicComponentTypeHandles or a variable number of some other custom container. This is usually to facilitate a highly generalized job that aggregates a lot of things in a user’s project. Right now, the solution is to make a job with a whole bunch of type handles or collections and hope that is good enough.
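The workaround described above looks roughly like this (a sketch; the bank size and the processing body are illustrative):

```csharp
using Unity.Burst;
using Unity.Burst.Intrinsics;
using Unity.Entities;

// A fixed bank of optional handles stands in for the variable-length list
// of DynamicComponentTypeHandles we actually want to pass to the job.
[BurstCompile]
struct AggregateJob : IJobChunk
{
    public DynamicComponentTypeHandle handle0;
    public DynamicComponentTypeHandle handle1;
    public DynamicComponentTypeHandle handle2;
    public DynamicComponentTypeHandle handle3;
    public int usedHandleCount; // how many of the above are valid

    public void Execute(in ArchetypeChunk chunk, int unfilteredChunkIndex,
                        bool useEnabledMask, in v128 chunkEnabledMask)
    {
        if (usedHandleCount > 0) Process(in chunk, ref handle0);
        if (usedHandleCount > 1) Process(in chunk, ref handle1);
        if (usedHandleCount > 2) Process(in chunk, ref handle2);
        if (usedHandleCount > 3) Process(in chunk, ref handle3);
    }

    void Process(in ArchetypeChunk chunk, ref DynamicComponentTypeHandle handle)
    {
        // Aggregate the raw component data for this type; details omitted.
    }
}
```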
There’s currently not a clean way to do the operation of “If this authoring instance is referenced in a list by any other authoring instance, add this runtime component”. You can only do this if you make the restriction that the other authoring instance with the list is an ancestor in the hierarchy. Incremental baking is hard, and I respect that this is not an easy problem to fix. But I will still bring it up here, since it is a limitation a lot of people run into.
Every shadow caster has its own culling callback, and this stresses the job system a lot on the main thread. A similar thing happens for a cubemap and rendering all 6 sides. These are common performance complaints people ask me if I have suggestions for solving.
Subscene import workflows have significant usability issues. Because they occur in a separate Unity process, they do not use Burst, cannot easily be debugged, have limited reporting capabilities of memory leaks and the like, and many engine features are not well tested when accessed in this mode (it took 3 years for the audio crash bug to be fixed).
The Latios Framework pushes the boundaries of what can be baked, with new and exciting high-level features. But that only works when baking itself works, which has been a constant pain point.
I have to move an entity, or more often an array of entities, twice if I have both a set of components to add and a set of components to remove.
I believe Dynamic Buffers should be allocated from a custom ECS-managed allocator and not with Allocator.Persistent. The allocations are often small and would be better suited to a pool. This has become a measurable performance item in the profiler for me. A very common culprit I see is LinkedEntityGroup on prefabs that don’t have children. When the [InternalBufferCapacity] was changed from 1 to 0, instantiation started allocating a 128-byte buffer for each entity, only to use 8 of those bytes. This is extremely wasteful. Speaking of:
LinkedEntityGroup’s internal capacity was changed from 1 to 0; however, it is still added to solo prefab entities, causing heap allocations every time you instantiate the prefab. This was a measurable performance regression in one of my projects, and I had to write a baking system to address it.
If an entity has a component or buffer containing an entity field that references the entity itself, and it is instantiated, entity remapping will only occur if a LinkedEntityGroup is attached. This is bad, because the above problems make LinkedEntityGroup not cheap, and I want to use a modular setup where I might reference a buffer value by entity and offset, where oftentimes the entity referenced is a self-reference.
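A minimal illustration of the modular setup described above (the type is hypothetical): without a LinkedEntityGroup, instantiating a prefab carrying this component leaves target pointing at the source prefab entity rather than the new instance.

```csharp
using Unity.Entities;

// References a value stored in a DynamicBuffer on 'target' at 'bufferIndex'.
// 'target' is frequently the owning entity itself, yet it is only remapped
// on instantiation when a LinkedEntityGroup is present.
public struct BufferValueRef : IComponentData
{
    public Entity target;
    public int bufferIndex;
}
```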
This used to trigger an error. It was finally patched, but the solution was to process each entity one-by-one. Performance is awful.
I frequently run into issues where default root ComponentSystemGroups accidentally get added to other groups if I don’t explicitly remove them from the list. Since these are systems that Unity will manually create, they should have a [DisableAutoCreation] attribute. At least now that they are partial, I am able to fix this with asmref.
If you try to get all systems, the systems with [DisableAutoCreation] don’t get added to the list even if you specify All like the XML documentation suggests. I have to use reflection for this now, which sucks.
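The reflection fallback looks roughly like this (a hedged sketch assuming Entities 1.x type names):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using Unity.Entities;

static class AllSystemTypes
{
    // Enumerates every concrete system type, including those marked with
    // [DisableAutoCreation], which the built-in enumeration skips.
    public static List<Type> Get()
    {
        var results = new List<Type>();
        foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
        {
            Type[] types;
            try { types = assembly.GetTypes(); }
            catch (ReflectionTypeLoadException e)
            {
                // Some assemblies fail partial loading; keep what resolved.
                types = e.Types.Where(t => t != null).ToArray();
            }
            foreach (var t in types)
            {
                if (t.IsAbstract || t.ContainsGenericParameters)
                    continue;
                if (typeof(ComponentSystemBase).IsAssignableFrom(t) ||
                    (typeof(ISystem).IsAssignableFrom(t) && t.IsValueType))
                    results.Add(t);
            }
        }
        return results;
    }
}
```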
Unity has internal APIs to do this which GPU Resident Drawer uses. But I’d really like these to be public API so that I can do registrations efficiently too.
If you have a million entities that each have three children, and a million empty “parent candidates”, and then you randomly reassign the parent of every child, the order of all your entities will be completely nondeterministic and change every time you enter play mode.
These are some of the ugly things I have had to do to make certain framework features work. It would be nice if some of these things could be made more elegant.
We have ICustomBootstrap for setting up systems at runtime. Why can’t we do the same thing in the Editor? I ended up using ILPP to hack the code to allow for this. I also have custom bootstraps for each NetCode world type. And I think every world Unity creates should allow for a custom bootstrap. I haven’t figured out how to hack streaming worlds or baking worlds yet, so I end up having to “post-hack” those with an injected system that installs an IRateManager to reshuffle everything.
Then there’s baking. I have a custom Skinned Mesh Rendering solution. Why can’t I turn off the built-in Entities Graphics Skinned Mesh Renderer baking without turning off the entire baking of Entities Graphics? (At this point, the companion GameObject bakers are the only part of Entities Graphics I want to reuse.) Once again, I hacked this by using a custom baker list mechanism that seemed to be created for tests.
And while we are on this topic, I would greatly appreciate a flag on the UpdateBefore/After attributes to suppress warnings about the systems being in the wrong groups. Such systems might simply not be installed at all; a user may have replaced one with a custom version. Or sometimes I have a system that I want to update in multiple groups, where one of the groups needs the Before/After attribute but the Before/After system doesn’t exist in the second group. Bonus points if the warnings can be suppressed externally.
Personally, I think the whole bottom-up automatic injection design of systems is problematic. It makes it difficult for users to optimize system ordering for better worker thread occupancy, unless they want to decorate their systems with false dependencies. It becomes impossible to know just by looking at the code what the actual order of systems is if there’s a bug where some data is getting changed in the wrong place. And it makes it really hard to copy a system into a different project. Also, how do you define a system to run more than once in a frame?
A top-down approach solves all these problems, and the Latios Framework has the mechanisms in-place to support this. Unfortunately, this conflicts with a lot of existing paradigms. I don’t know the right answer.
Lastly, the whole ICustomBootstrap thing does not play well with embedded samples inside of packages. Bootstraps should be settings assets that can be swapped in the Editor. This feature is planned for a future Latios Framework version, but I wish I didn’t have to be the one to do it.
Currently the Latios Framework has this SmartBlobber mechanism for creating blob assets in baking systems based on a “request” protocol. For each blob type, the user has to register the type so that a generic system can properly ref-count and store blobs in the BlobAssetStore (deduplicating in the process).
I currently face two problems that I have hacked around. First, adding concrete types to BlobAssetStore is not Burst-compatible; I have to use internal APIs to precompute the type hash prior to the job. Second, I would much rather add UnsafeUntypedBlobAssetReference blobs directly so that I don’t need generics. Honestly, I think the BlobAssetStore should use Burst’s type hashes instead of System.Type.GetHashCode and expose that as API for working with UnsafeUntypedBlobAssetReference.
The Latios Framework Smart Blobbers are a powerful concept. They allow baking systems to generate blob assets without necessarily knowing or caring how those blob assets will be used. User bakers can request blob assets to be created. Baking systems create the blob assets, then pass the blobs back to the user to do what they please. The issue is how to pass those blobs back to the user without making the user write a custom baking system, which is error-prone. The solution I came up with is to create a generic baking system and a “bake item”. The bake item is a stateful IComponentData which does the original baking and then later receives a callback with a reference to the EntityManager and the primary entity to resolve any blob asset requests and assign them to components. This works, but it involves generic systems, and it is still somewhat unsafe.
I would like to see a better design in Unity official. Because right now incremental blob baking tends to be fairly buggy for both the Physics and NetCode packages, to the point where people usually just close subscenes before entering play mode. I don’t know what the right solution is, but I’d love to participate in the discussions as it’s a matter I see a lot of people struggle to understand.
In MonoBehaviours, you can do GetComponent&lt;ISomeInterface&gt;(). In bakers, this isn’t possible. Why?
Most of the time, I want a baker to check if some interface exists on the same Game Object, and if so, early out so that another Baker that processes the interface can work unhindered. I was able to add this myself, so it is definitely possible. However, my approach needs to use extension methods rather than member methods, which is awkward because it requires typing this a lot.
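The extension-method workaround can be sketched like this (hedged; the method and class names are illustrative, and it assumes IBaker.GetComponents registers the incremental baking dependencies):

```csharp
using Unity.Entities;
using UnityEngine;

public static class BakerInterfaceExtensions
{
    // Returns the first component on the authoring GameObject implementing T,
    // or null if none exists. Going through the baker's GetComponents call
    // registers the dependencies needed for incremental baking.
    public static T GetComponentInterface<T>(this IBaker baker) where T : class
    {
        foreach (var candidate in baker.GetComponents<Component>())
        {
            if (candidate is T match)
                return match;
        }
        return null;
    }
}
```

A baker would then early out with a call like this.GetComponentInterface&lt;ISomeInterface&gt;() — the required this. prefix being exactly the awkwardness mentioned above.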
I started using FixedString and BlobArray&lt;byte&gt; in blobs because I couldn’t log BlobStrings in Burst-compiled code. There are a lot of missing APIs and features for BlobStrings. Make them better so that I can be more efficient with my data. And when you do make them better, please tell me.
I feel like I shouldn’t need to write custom code to do this. Some subscenes are critical to be loaded before systems should start running. The player falling through the floor is a common complaint I’ve seen.
Psyshock uses generic jobs in Physics.FindPairs() using a pattern that allows Burst to detect and compile the jobs both in the Editor and in builds without having to explicitly register the generic types with attributes. Unfortunately, the ILPP can’t pick up on it and patch these jobs to be Burst-schedulable. There should not be a discrepancy!
Currently I am relying on reflection to find and call the EarlyJobInit() methods myself for specific generic types. Honestly, I think my solution is fine, and I’d like to see something similar to it formalized without as many IL2CPP code-stripping gotchas.
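The reflection shim looks roughly like this (heavily hedged: where the source generators place the generated EarlyJobInit method is an internal implementation detail, so the lookup below is illustrative only):

```csharp
using System;
using System.Reflection;

static class GenericJobInit
{
    // Invokes the generated EarlyJobInit() for a specific closed generic job
    // type so Burst can schedule it in builds, without needing a
    // [RegisterGenericJobType] attribute for every combination.
    public static void EarlyInit(Type closedJobType)
    {
        var method = closedJobType.GetMethod(
            "EarlyJobInit",
            BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Static);
        method?.Invoke(null, null);
    }
}
```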
Is there a more performant way to get the raw blend shapes data (the deltas, not the animated parameters) than queueing up a bunch of async readbacks and then batch-completing them inside a baking system?
One of the features of blackboard entities in the Latios Framework is that they merge components of blackboard config entities whenever a subscene containing them loads. This allows the user to spread config authoring data across multiple GameObjects. However, doing this merging at runtime is surprisingly difficult. While it is easy to get the ComponentType list to copy from one entity to another, it is significantly more difficult to actually copy those types. For unmanaged components, we have the tools now. But for managed components, especially shared components, it is problematic. Currently, the Latios Framework uses reflection, but I would love for there to be a proper EntityManager.AddComponentFromOtherEntity(Entity src, Entity dst, ComponentType ct) API so that I can Burst-compile this whole thing.
I have large chunk components. Reading/Writing by ref is way faster. I have asmref extensions to do this, but official support would be better.
Similarly, I’ve also noticed ref gaps for EntityManager.
I cast the EntityStorageInfoLookup into EntityTypeHandle, since they use the same AtomicSafetyHandle. I’d appreciate it if this was formalized, as my implementation is very spicy.
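The cast in question is roughly the following (very much a sketch; it depends on the two structs keeping compatible internal layouts across Entities versions, which is exactly why it is spicy):

```csharp
using Unity.Collections.LowLevel.Unsafe;
using Unity.Entities;

static class SpicyCasts
{
    // Reinterprets an EntityStorageInfoLookup as an EntityTypeHandle,
    // relying on both types wrapping the same AtomicSafetyHandle.
    public static EntityTypeHandle AsEntityTypeHandle(ref EntityStorageInfoLookup lookup)
    {
        return UnsafeUtility.As<EntityStorageInfoLookup, EntityTypeHandle>(ref lookup);
    }
}
```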
These are things that bother me, but aren’t detracting from the overall quality of the framework in any meaningful way.
Why does this not exist?
Usually, whether the sign of the value returned is positive or negative means something in the other language APIs this method is supposed to emulate. But currently, that meaning is completely undocumented. All that is documented is whether or not the value is zero.
IJobEntityChunkBeginEnd doesn’t support a derived interface that uses default interface methods, because the source generators generate code that directly calls the methods rather than using a generic static invoker.
This was marked as “as designed”. It annoys me because a different behavior would help mitigate the chunk fragmentation issue I regularly have to work around.
Codegen already injects the ComponentTypeHandles into IJobEntity. Can we have an [Inject] attribute for lookups and Time so that codegen does the same? That would reduce a bunch of boilerplate.
This isn’t possible in idiomatic foreach. I have to have some additional dummy read component around, or deep copy the entity array.
NativeStream doesn’t respect alignment, gets its counts messed up when writing piecewise but reading in bulk (or vice versa), can’t store writes of 4 kB or greater, can’t defer allocation with a schedule-time known allocation size, etc.
Shared component type handles don’t use lookup caches, but their untyped counterparts do.
These are common user complaints I hear that I resonate with, even if I’m no longer affected the same way.
Why are collections married to singletons? Why are there even singletons? Do you truly only want one of something, or do you just want to know which entity is the entity? The Latios Framework solves these use cases independently with blackboard entities and collection components.
The biggest issue I have with idiomatic foreach is that it is really clunky for large queries. With Entities.ForEach, you could put each argument (type and variable name) on a different line. That doesn’t really work well with idiomatic foreach. I recognize this is a hard problem, and I don’t have a proposed solution yet.
But at the very least, make it so that we can have Entity first in the tuple. It is difficult to articulate why, but the Entity being at the end annoys me and most others I talk to.
I keep finding these, and they always catch me off guard in custom bootstraps. Put them where they belong.
This was an incident that caught me off guard. It turns out these indices are always negative, and contain metadata packed inside them. I suspect some of this data wasn’t supposed to reach the public API surface, but it does.