
Exactly my point; that said, I will point out to anyone still struggling with the problem that you actually need no knowledge of any specific programming language or syntax - or indeed of programming at all - to get this one: basic maths will see you through. I chose the example carefully for that reason.


Of course, it's absurd and unfair to expect APUGgers to spot something that is blindingly obvious to someone who programs for a living. Almost as absurd and unfair as to expect Joe Public to spot something which is blindingly obvious to the significantly smaller set of 'people who've ever set foot in a darkroom'.
 
I'm far from stupid, but I have done some incredibly stupid things. There's a thread on here about stupid things done in a darkroom, and I'm sure we could all contribute to it.
 
I'm afraid I'm not into software and don't write code at all, but I do know that hexadecimal is base 16 - yet the code provided assumes hex is base 17.

Traditionally 0x09 + 0x01 = 0x0A !
 
My only code writing experience is 30+ years old (remember Algol-W?) but I have some idea of what you are talking about.

I believe it was Dijkstra who said, "Algol is a unique programming language, in that it is a vast improvement over its descendants."

Steve
 


I think you missed it too. It has nothing to do with the StringBuffer. The array of hexadecimal digit strings includes '10' - ten - which is two characters, so the StringBuffer ends up 6 or 7 characters long depending on whether '10' was drawn. The list should be:
0,1,2,3,4,5,6,7,8,9,a,b,c,d,e,f

If the datatype were the correct single char type rather than String, the error would be obvious, as '10' does not fit in a single char.

And yes a common bug.
 

Funny, yes. Funnier still is the ridiculously over-complicated approach your average C++ programmer (C++ being the "digital" of C) typically takes to a problem like this.

Here's the straightforward ANSI C version that even has output:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char q[] = "0xffffff";   /* "0x" + 6 hex digits + NUL = 9 bytes */

    /* %6.6x: exactly six hex digits, zero-padded; the mask keeps 24 bits */
    sprintf(q, "0x%6.6x", rand() & 0xffffff);
    fprintf(stdout, "%s\n", q);

    return 0;
}
 
If you've ever written embedded software you'll know that many embedded C libraries implement only a subset of printf functionality, and for good reason: as a rule of thumb, printf is almost never the most efficient way to do anything. Provided StringBuffer is sanely implemented, an approach that generates one long and concatenates its digits byte by byte (you absolutely don't want to execute rand() six times) may well be quicker - after all, what do you think printf is doing, if not exactly that with the added overhead of format-string parsing? - and quite possibly significantly more readable.

Anyway, to say C++ is 'the digital version of C' is logically absurd. A passing knowledge of Turing equivalence tells you the two languages can express exactly the same computations (and of course early C++ compilers simply translated the code to C). As tools, both languages can be used to write excellent code or awful code.


Which is to say, the faults in the code - and modern programming generally - lie not in the languages, but in the programmers.


Mind you - I prefer Objective-C anyway...