Don't forget to use `unsigned char` while working with bytes

The other day I made the mistake of using type `char` for a buffer holding bytes read from an image file, specifically a PNG. The corrected program is below for your perusal.
    /* Test file reading and see if there's random data */

    #include <stdio.h>
    #include <stdlib.h>

    #define PNG_BYTES_TO_CHECK 8

    int main(void)
    {
        char fname[] = "../images/2.png";

        FILE *fp = fopen(fname, "rb");
        if (fp == NULL) abort();

        unsigned char *buffer = malloc(PNG_BYTES_TO_CHECK);
        if (buffer == NULL) abort();

        if (fread(buffer, 1, PNG_BYTES_TO_CHECK, fp) != PNG_BYTES_TO_CHECK)
            abort();

        unsigned i;
        for (i = 0; i < PNG_BYTES_TO_CHECK; ++i) printf("%.2X ", buffer[i]);
        printf("\n");

        free(buffer);
        fclose(fp);
        return 0;
    }
The idea is to ensure that the 8 signature bytes match those of a valid PNG image, viz.
89 50 4E 47 0D 0A 1A 0A
(the bytes 50 4E 47 are the ASCII characters P, N and G, respectively). However, changing `unsigned char` to `char` results in the following (wrong) first number:
FFFFFF89 50 4E 47 0D 0A 1A 0A
The value 0x89 is bigger than a `signed char` can hold: its high bit is set, so on a platform where plain `char` is signed it becomes a negative number, and when that `char` is promoted to `int` for the variadic `printf` call it gets sign-extended, producing FFFFFF89. Reading around, I see that passing a negative `int` where `%X` expects an `unsigned int` is technically undefined behavior anyway.
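A minimal sketch of the promotion at work (variable names are mine; this assumes the common two's-complement case, where converting 0x89 to a signed byte yields a negative value):

    #include <stdio.h>

    int main(void)
    {
        signed char s = (signed char)0x89;  /* implementation-defined; -119 on common platforms */
        unsigned char u = 0x89;             /* always 137 */

        /* Both are promoted to int for the variadic call:
         * s is sign-extended, u is zero-extended. */
        printf("%X\n", s);  /* typically FFFFFF89 */
        printf("%X\n", u);  /* 89 */
        return 0;
    }
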

I see that, as a signed number, it's -119: since \({89}_{16}\equiv {137}_{10}\), it exceeds \(2^7 = 128\) by 9, and adding that 9 to the floor of a signed byte, \(-128\), gives \(-119\). Equivalently, \(137 - 256 = -119\).
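The arithmetic can be double-checked in a couple of lines (again assuming two's complement, so that conversion to a signed byte amounts to reduction modulo 256 into the range \([-128, 127]\)):

    #include <stdio.h>

    int main(void)
    {
        /* 0x89 = 137 = 128 + 9, so folding into the signed range
         * gives -128 + 9 = -119, i.e. 137 - 256. */
        printf("%d\n", 0x89 - 256);          /* -119 */
        printf("%d\n", (signed char)0x89);   /* -119 on two's-complement machines */
        return 0;
    }
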