A signed char is a fundamental integer data type in C that occupies exactly one byte of memory and is explicitly signed (unlike plain char, whose signedness is implementation-defined). It is the smallest addressable signed integer type in the C language.

Technical Specifications

  • Size: Guaranteed to be exactly 1 byte; sizeof(signed char) is 1 by definition. The number of bits in this byte is given by the CHAR_BIT macro in <limits.h> (8 on virtually all modern architectures).
  • Memory Representation: The Most Significant Bit (MSB) acts as the sign bit. Modern C implementations utilize two’s complement representation for negative values.
  • Value Range:
    • The C Standard guarantees a minimum range of -127 to 127 (the -128 lower limit is only mandated as of C23, which requires two's complement representation).
    • On standard 8-bit two’s complement systems, the exact range is -128 to 127.
    • These limits are exposed via the SCHAR_MIN and SCHAR_MAX macros in <limits.h>.

Type Distinctness

In the C type system, char, signed char, and unsigned char are three distinct types. Even on implementations where plain char is signed, char and signed char remain distinct types. Consequently, a pointer to char (char *) is not strictly compatible with a pointer to signed char (signed char *) without an explicit cast.

Syntax and Initialization

#include <stdio.h>
#include <limits.h>

int main(void) {
    // Declaration and initialization
    signed char a = 100;
    signed char b = -50;
    
    // Character literals have type 'int' in C; a value within the
    // signed char range (here 'Z', 90 in ASCII) is converted without loss.
    signed char c = 'Z'; 

    // SCHAR_MIN and SCHAR_MAX define the architectural limits
    signed char min_limit = SCHAR_MIN;
    signed char max_limit = SCHAR_MAX;

    return 0;
}

Format Specifiers and I/O

When interacting with standard I/O functions like printf or scanf, formatting a signed char requires precise combinations of length modifiers and conversion specifiers:
  • printf and %hhd: Because printf is a variadic function, a signed char argument is subject to default argument promotions and is always passed as an int. In the format string %hhd, d is the conversion specifier for a signed decimal integer, and hh is the length modifier. The hh modifier instructs printf to take the promoted int argument and convert it back to a signed char before formatting.
  • scanf and %hhd: To read a numeric value directly into a signed char variable, use %hhd. This correctly expects a pointer of type signed char *.
  • The %c Conversion Specifier: The c conversion specifier is used for character data. While passing a signed char to printf("%c", val) is valid (due to promotion to int), %c with scanf strictly requires a char * argument. Passing a pointer to a signed char (signed char *) to scanf with %c is a pointer type mismatch and typically triggers compiler warnings (e.g., with -Wformat).
signed char val = -120;

// printf takes the promoted int, and 'hh' converts it back to signed char
printf("Integer value: %hhd\n", val);

// scanf requires the exact length modifier 'hh' and conversion specifier 'd'
// to safely write to a signed char pointer
scanf("%hhd", &val);

Integer Promotion

When a signed char is used in an arithmetic expression, it is subject to integer promotion. Before the operation is evaluated, the signed char is implicitly promoted to an int (preserving its sign and value). The result of the arithmetic operation will be of type int unless explicitly cast back to signed char.
signed char x = 100;
signed char y = 50;

// x and y are promoted to int before addition.
// The result (150) fits in an int, preventing overflow during the operation.
int result = x + y; 