Code:
fade_in_duration :: uint8 (optional)
fade_out_duration :: uint8 (optional)
scale_end :: float (optional)
scale_initial :: float (optional)
Use case: I have some relatively large sprites (the largest is 16 x 4 tiles) that are rendered to make an animation (the lightning strikes below). I really just want it to fade away. I can't use an explosion because the sprite needs to be scaled along its X axis only, depending on the distance it needs to be drawn over (the bolt uses an L-system algorithm to connect two points); it also needs to be rotated with precision.
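For context, here is roughly how one of these bolt segments gets drawn. This is a simplified sketch assuming LuaRendering's draw_sprite; the sprite path, the 16-tile base length and the helper function are made up for illustration, not the actual mod code.

Code:
-- Hypothetical sketch: draw one bolt segment between two map positions.
-- Only the scale/rotation part is shown; the real bolt comes from an L-system.
local function draw_bolt_segment(surface, from, to)
  local dx, dy = to.x - from.x, to.y - from.y
  local length = math.sqrt(dx * dx + dy * dy)
  -- Factorio orientation is 0..1 with 0 = north, increasing clockwise; depending
  -- on which way the source sprite points, an extra 0.25 offset may be needed.
  local orientation = (math.atan2(dx, -dy) / (2 * math.pi)) % 1
  return rendering.draw_sprite{
    sprite = "lightning-bolt-1",                          -- hypothetical sprite path
    surface = surface,
    target = {(from.x + to.x) / 2, (from.y + to.y) / 2},  -- centre the sprite on the segment
    orientation = orientation,
    x_scale = length / 16,  -- assuming the base sprite spans 16 tiles along X at scale 1
    y_scale = 1,
    time_to_live = 60,
  }
end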
Currently I have two options:
1. Make it an animation, which hugely multiplies the effect's sprite atlas usage when the main thing it needs to do is simply fade away. There are 8 variations at 3 sprite sizes, which is already a significant amount of atlas space with just 1 frame each, and the fade needs at least 30 frames to look good.
2. Make it a sprite, and use conditional on_tick shenanigans over the sprite's lifetime to set its alpha and multiply the RGB channels by that alpha.
The latter is what I'm currently doing (roughly as sketched below), and I'm taking the opportunity to slightly adjust the y_scale of the sprite as well so it distorts as it fades. Performance is actually not too bad, but it obviously gets worse when many of these things fire at once. The turret entity in question is inactive most of the time in a typical game, but in theory someone could spam hundreds of them down in hotspots, which could tank UPS.
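The workaround looks roughly like this. It's a minimal sketch assuming the 2.0-style Lua API, where draw_sprite returns a LuaRenderObject and persistent mod data lives in storage; the table name and the 30-tick fade length are invented.

Code:
local FADE_TICKS = 30

-- Register a freshly drawn bolt (a LuaRenderObject) to be faded out.
local function start_bolt_fade(render_obj)
  storage.bolt_fades = storage.bolt_fades or {}
  table.insert(storage.bolt_fades, {obj = render_obj, ticks_left = FADE_TICKS})
end

script.on_event(defines.events.on_tick, function()
  local fades = storage.bolt_fades
  if not fades or #fades == 0 then return end  -- cheap early-out while nothing is fading
  for i = #fades, 1, -1 do
    local fade = fades[i]
    fade.ticks_left = fade.ticks_left - 1
    if fade.ticks_left <= 0 or not fade.obj.valid then
      if fade.obj.valid then fade.obj.destroy() end
      table.remove(fades, i)
    else
      local a = fade.ticks_left / FADE_TICKS
      -- Premultiplied alpha: multiply the RGB channels by the alpha as well.
      fade.obj.color = {r = a, g = a, b = a, a = a}
      -- Slightly squash the bolt each tick so it distorts as it fades.
      fade.obj.y_scale = fade.obj.y_scale * 0.97
    end
  end
end)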
Request: allow us to set initial tint and final tint for a sprite; if they are defined, interpolate between them during the sprite's lifetime.
Bonus points: also allow x_scale_initial, x_scale_final, y_scale_initial and y_scale_final, interpolated over the TTL in the same way.
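To make the request concrete, the interpolation asked for is just a per-channel (and per-scale) lerp over the sprite's time_to_live. A plain-Lua sketch of the intended semantics, with invented helper names:

Code:
-- Linear interpolation between an initial and a final value at fraction t in [0, 1].
local function lerp(a, b, t) return a + (b - a) * t end

-- What tint_initial/tint_final would mean: each channel lerped independently
-- as the sprite's TTL elapses; the scale pairs would work the same way.
local function interpolated_tint(tint_initial, tint_final, t)
  return {
    r = lerp(tint_initial.r, tint_final.r, t),
    g = lerp(tint_initial.g, tint_final.g, t),
    b = lerp(tint_initial.b, tint_final.b, t),
    a = lerp(tint_initial.a, tint_final.a, t),
  }
end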