UIColor

class UIColor : NSObject, NSSecureCoding, NSCopying
  • Initialize a color using a 32-bit integer that represents a color, and an optional alpha component. Only the rightmost 24 bits are used; the leftmost 8 bits are ignored.

    It is recommended to always include the leading zeros when writing a literal color, to prevent confusion.

    UIColor(hex: 0x0000FF) // blue color
    

    rather than:

    UIColor(hex: 0xFF) // also blue, but easy to misread
    

    Examples

    UIColor(hex: 0xFF0000) // red color
    UIColor(hex: 0x00FF00) // green color
    UIColor(hex: 0x0000FF) // blue color
    

    Declaration

    Swift

    public convenience init(hex: UInt32, alpha: CGFloat = 1)

    Parameters

    hex

    The hexadecimal value to use when initializing the color. The leftmost 8 bits are ignored.

    alpha

    The alpha value to use when initializing the color. Defaults to 1.
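
    The channel extraction such an initializer performs can be sketched as plain bit masking. The helper below is hypothetical (not part of this API) and only illustrates how the rightmost 24 bits split into three 8-bit channels:

    ```swift
    /// Hypothetical helper: splits the rightmost 24 bits of `hex` into
    /// red, green, and blue components in the 0...1 range.
    func rgbComponents(of hex: UInt32) -> (red: Double, green: Double, blue: Double) {
        let red   = Double((hex >> 16) & 0xFF) / 255 // bits 16-23
        let green = Double((hex >> 8)  & 0xFF) / 255 // bits 8-15
        let blue  = Double(hex         & 0xFF) / 255 // bits 0-7
        return (red, green, blue)
    }
    ```

    Because only the low 24 bits are read, the leftmost 8 bits of the integer never reach any channel, matching the behavior documented above.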

  • Initialize a color using a case-insensitive hexadecimal string, with an optional # or 0x prefix.

    Examples

    UIColor(hexString: "FF0000") // red color
    UIColor(hexString: "#00ff00") // green color
    UIColor(hexString: "0x0000FF") // blue color
    UIColor(hexString: "0x0000FG") // nil
    UIColor(hexString: "FF000") // nil
    UIColor(hexString: "#FF000") // nil
    UIColor(hexString: "0xFF000") // nil
    

    Declaration

    Swift

    public convenience init?(hexString: String, alpha: CGFloat = 1)

    Parameters

    hexString

    The hexadecimal string to use when initializing the color. The string may start with 0x or # and must then contain exactly 6 hexadecimal characters. Any invalid character causes the initializer to fail and return nil.

    alpha

    The alpha value to use when initializing the color. Defaults to 1.
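
    The validation implied by the failable initializer can be sketched as a small parser. The helper below is hypothetical (not part of this API): it strips an optional # or 0x prefix, requires exactly six hexadecimal digits, and returns nil otherwise, mirroring the examples above:

    ```swift
    /// Hypothetical helper: parses a color string such as "#00ff00" or
    /// "0x0000FF" into a 24-bit color value, or nil if invalid.
    func parseHexColorString(_ string: String) -> UInt32? {
        var hex = string.lowercased() // case-insensitive comparison
        if hex.hasPrefix("#") {
            hex.removeFirst(1)
        } else if hex.hasPrefix("0x") {
            hex.removeFirst(2)
        }
        // Exactly 6 digits must remain after the prefix is removed.
        guard hex.count == 6 else { return nil }
        // UInt32(_:radix:) returns nil for any non-hex character.
        return UInt32(hex, radix: 16)
    }
    ```

    Lowercasing up front handles the case-insensitivity, and letting `UInt32(_:radix:)` fail on non-hex characters covers inputs like "0x0000FG" without a separate character check.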