Mono path[0] = 'C:/Users/noahk/Desktop/BensonV3/Racesm_L_Data/Managed'
Mono config path = 'C:/Users/noahk/Desktop/BensonV3/MonoBleedingEdge/etc'
[Physics::Module] Initialized MultithreadedJobDispatcher with 15 workers.
Initialize engine version: 2022.3.11f1 (d00248457e15)
[Subsystems] Discovering subsystems at path C:/Users/noahk/Desktop/BensonV3/Racesm_L_Data/UnitySubsystems
Forcing GfxDevice: Null
GfxDevice: creating device client; threaded=0; jobified=0
NullGfxDevice:
    Version:  NULL 1.0 [1.0]
    Renderer: Null Device
    Vendor:   Unity Technologies
Begin MonoManager ReloadAssembly
- Loaded All Assemblies, in 1.004 seconds
- Finished resetting the current domain, in 0.003 seconds
ERROR: Shader Sprites/Default shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
Microsoft Media Foundation video decoding to texture disabled: graphics device is Null, only Direct3D 11 and Direct3D 12 (only on desktop) are supported for hardware-accelerated video decoding.
ERROR: Shader Sprites/Mask shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
ERROR: Shader Legacy Shaders/VertexLit shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
WARNING: Shader Unsupported: 'Standard' - All subshaders removed
WARNING: Shader Did you use #pragma only_renderers and omit this platform?
WARNING: Shader If subshaders removal was intentional, you may have forgotten turning Fallback off?
ERROR: Shader Standard shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
WARNING: Shader Unsupported: 'Standard' - All subshaders removed
WARNING: Shader Did you use #pragma only_renderers and omit this platform?
WARNING: Shader If subshaders removal was intentional, you may have forgotten turning Fallback off?
UnloadTime: 0.821500 ms
Couldn't connect to trainer on port 5054 using API version 1.5.0. Will perform inference instead.