Re: About programming, a general question

On Fri, Dec 17, 2010 at 3:54 AM, Les <hlhowell@xxxxxxxxxxx> wrote:
On Fri, 2010-12-17 at 10:00 +0000, John Haxby wrote:

On 17 December 2010 09:41, Ralf Corsepius <rc040203@xxxxxxxxxx> wrote:

        That said, I'd choose "C" to get started. It's a bit of a
        rough ride in the beginning, but it pays off in the longer
        term.

Actually, no, C is dead easy to start but it gets really difficult
really quickly.  Consider these for a beginner:

  * Write the declaration of signal(3) -- it takes two parameters, an
integer and a pointer to a function that takes an integer parameter
and returns void.  Explain why the parentheses are needed.

 * Why does "a + b == 0" work the way you expect but "a & b == 0" does
not?  Are you sure it doesn't?

 * What is the difference between "const char *s" and "char * const s"?

 * What is the difference between "char *s" and "char s[]"?

Admittedly the very first of these is not likely to come up for a
beginner, but the other three will, and they'll bite you good and
hard.

C is not a simple language; it has a lot of subtlety and it is
incredibly expressive, but I would not use it as the beginning
language for someone who wants to learn to program.  I'd start with a
language that was designed carefully.  There aren't any Algol68
compilers any more :-) but I'd choose Python or Java to learn to
program.  Once you know what you want to do then you can go for
something else, something applicable to what you want to do.  When you
know the basics those questions about C are still difficult, but at
least you're not trying to understand them at the same time as
learning what happens to a parameter when you pass it to a function
or, for that matter, what a function is.


I have taught 4 different languages professionally: BASIC, PASCAL,
FORTRAN (actually a language with syntax similar to FORTRAN), and C.  I
prefer to teach people C.

       I am not a real C guru, but I can tell you that it has the power
for either great abstraction or for staying very close to the machine.
It depends on how the programmer views the problem.  I know that there
are people who will argue with me on that point, but let's not get
sidetracked here.  The goal of a programmer is to produce useful working
code that does the jobs that need to be done.

       There are anomalies in all languages.  Some of them are inherent to the
language, some are due to the compilation process, in the conversion
from a somewhat human-recognizable syntax to machine code in several
stages.

       So to talk about the things that have bothered one person in one
language is not really material at this point.

One could produce many examples from C, aside from the ones you have
waved off, where even experts disagree as to what C syntax is supposed
to mean, and most of the code I've seen is peppered with potential
mysteries as to why an asterisk or ampersand does or does not appear
where it does.

And in some cases those issues can be compiler dependent (as in the
a + b == 0 case, where the potential errors come from whether a and b
are integer, float, double or long double, and from what the defined
value of true is: b == 0 can be 1 or -1, integer or long.  If 1, it is
binary 0000 0000 0000 0001, and -1 is 1111 1111 1111 1111, for
example).  And the C definition makes char and byte synonymous at this
time, although new character sets are represented with multi-byte or
wide-character encodings.

       So the discussions of evaluation are compiler dependent and in most
cases somewhat machine dependent as well.

       My view is of a programmer who does mostly embedded type stuff, so it
is different from a person who specializes in say graphics, or
databases, or even text manipulation, or perhaps a medical or biological
programmer.  Furthermore, some programmers specialize in mathematical
areas, such as filtering, or high precision, or signal processing.
Others work in theoretical physics.

       Choosing Python to manage long chain mathematics would probably not be
too efficient or productive.

Unless you become adept at calling compiled mathematical libraries
(probably written in C or Fortran) from Python, which is exactly why
Python is relatively popular as a scientific/mathematical scripting
language.

Fortran would not be good for text
manipulation.  Lisp would not make a good report generator (in my
opinion anyway), APL would not be good for database administration.

       I do all kinds of programming in C.  I cannot do a full database from
scratch in C, and would probably use SQL in some form for that, but for
most "quick hack" tools I still use C from the command line.  Most of my
programs are repeat use, but require very little interaction and little
in the way of feedback (think YUM as an example).

       When working on microcontrollers I use Assembly for most programs.
When working in my professional field as a Test Programmer, I rely on C,
BASIC, assembly and machine language (bit control of parts is required
to test them).

       This is all to say that the ultimate language you use will be mostly
determined by your career path, but you will most likely use more than
one.

       To learn, I recommend Intel Assembler code and C.  Mostly because they
are somewhat good examples of both power and complexity, and can produce
quick simple programs that do useful things and represent real
work.

And many of the things that are dangerous/subtle/hard-to-learn about C
are much less obscure if you are an assembly-language programmer. In
fact, they seem quite natural. Trouble is, they will *not* seem
natural to a beginning programmer.

users mailing list