I wrote main() and listed the SpriteKit equivalents of the ShaderToy variables at the bottom of my answer.
Customization
To apply a shader to your node, you tell SpriteKit to load the shader from a .fsh file and bind it to the SKSpriteNode.
- Create an empty text file ending in .fsh for the shader code.
shader1.fsh
void main() {
    vec4 val = texture2D(u_texture, v_tex_coord);
    vec4 grad = texture2D(u_gradient, v_tex_coord);
    if (val.a < 0.1 && grad.r < 1.0 && grad.a > 0.8) {
        vec2 uv = gl_FragCoord.xy / u_sprite_size.xy;
        uv = screenDistort(uv);
        vec3 video = getVideo(uv);
        float vigAmt = 3. + .3 * sin(u_time + 5. * cos(u_time * 5.));
        float vignette = (1. - vigAmt * (uv.y - .5) * (uv.y - .5)) * (1. - vigAmt * (uv.x - .5) * (uv.x - .5));
        video += stripes(uv);
        video += noise(uv * 2.) / 2.;
        video *= vignette;
        video *= (12. + mod(uv.y * 30. + u_time, 1.)) / 13.;
        gl_FragColor = vec4(video, 1.0);
    } else {
        gl_FragColor = val;
    }
}
- Then attach the shader to the sprite in your SpriteKit code.
shader1.swift
let sprite = self.childNodeWithName("targetSprite") as! SKSpriteNode
let shader = SKShader(fileNamed: "shader1.fsh")
sprite.shader = shader
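Note that the shader above samples a custom uniform named u_gradient, which you must supply yourself. Here is a minimal sketch of one way to do that, assuming a gradient image named "gradient" exists in your bundle (the image name is a placeholder, not from the original code):

```swift
// Supply the custom u_gradient uniform sampled by shader1.fsh.
// "gradient" is a hypothetical texture asset name.
let gradientTexture = SKTexture(imageNamed: "gradient")
shader.uniforms = [
    SKUniform(name: "u_gradient", texture: gradientTexture)
]
```

The built-in uniforms (u_texture, u_time, u_sprite_size) are provided by SpriteKit automatically; only custom samplers like u_gradient need an explicit SKUniform.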
Description
- A shader computes an effect color for each pixel (for example via screenDistort(uv)).
- main() is the entry point.
- gl_FragColor is the return value.
- This code is executed once for every pixel in the image.
- Each execution sets that pixel's color to the effect color. The vec4() call takes the values r, g, b, a.
ShaderToy variable names → SpriteKit variable names

iGlobalTime → u_time
iResolution → u_sprite_size
fragCoord.xy → gl_FragCoord.xy
iChannelX → SKUniform named "iChannelX" containing an SKTexture
fragColor → gl_FragColor
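As a short illustration of the mapping above, here is a hedged before/after sketch. The ShaderToy side is a generic example (not taken from the shader above); the SpriteKit side applies the table line by line:

```glsl
// ShaderToy version:
// void mainImage(out vec4 fragColor, in vec2 fragCoord) {
//     vec2 uv = fragCoord.xy / iResolution.xy;
//     fragColor = vec4(uv, 0.5 + 0.5 * sin(iGlobalTime), 1.0);
// }

// SpriteKit version: iResolution -> u_sprite_size,
// iGlobalTime -> u_time, fragColor -> gl_FragColor.
void main() {
    vec2 uv = gl_FragCoord.xy / u_sprite_size.xy;
    gl_FragColor = vec4(uv, 0.5 + 0.5 * sin(u_time), 1.0);
}
```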
With the equivalent SpriteKit variables in hand, you can now easily convert the remaining functions, which appear above main():
float noise {}
float onOff {}
float ramp {}
float stripes {}
vec3 getVideo {}
vec2 screenDistort {}
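As one hedged example of such a conversion (the body below is illustrative, not the original ShaderToy code): a helper that referenced iGlobalTime only needs its uniform name switched to u_time, with the rest left untouched:

```glsl
// Hypothetical helper for illustration only.
// ShaderToy: float ramp(float y) { return mod(iGlobalTime, y); }
float ramp(float y) {
    return mod(u_time, y);  // iGlobalTime -> u_time
}
```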
Theory
Q. Why does main() contain texture2D, u_gradient, and v_tex_coord?
A. Because SpriteKit works with textures and UV coordinates.
UV mapping
UV mapping is a 3D modeling process for projecting a 2D image onto the surface of a 3D model to display textures.
UV coordinates
When texturing a mesh, you need to tell OpenGL which part of the image should be used for each triangle. This is done with UV coordinates. Each vertex carries a pair of floats, U and V, in addition to its position. These coordinates are used to sample the texture.
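In SpriteKit's fragment shaders, v_tex_coord carries these UV coordinates for the current fragment, so the simplest possible shader just samples the node's own texture at that point (a minimal sketch):

```glsl
// Pass-through shader: look up the sprite's own texture (u_texture)
// at this fragment's UV coordinates and output that color unchanged.
void main() {
    gl_FragColor = texture2D(u_texture, v_tex_coord);
}
```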
SKShader class reference
OpenGL ES for iOS
Shader Recommendations
WWDC Session 606 - What's New in SpriteKit - Shaders, Lighting, Shadows