
Solved Am I too stupid?

G

giahung1997

Guest


#1
I've introduced myself in my first post on this forum, so I don't want to copy-paste it here. In short, I quit school in grade 7 and have been self-taught since.

Recently I played with Clang on FreeBSD and thought: I have a full system with a decent compiler, why not try to learn some C? So I started to learn C. After a while I stopped. So many Cs... C89, C99, C11... This language doesn't even have a built-in boolean type; I have to include <stdbool.h> (thanks, Stack Overflow). This language doesn't have fixed-width data types either; int could be 4 bytes or 8 bytes depending on the system, and to get fixed-width types I have to include <stdint.h> or <inttypes.h> (again, thanks SO). This language is just ... a mess. Or am I just stupid? After reading a summary of C99 vs C11 on SO my head nearly exploded, plus the GNU extensions... I decided to quit. To my simple mind, this is a hacking language, the language of engineers: people who only want to get something up and doing the job, and who patch, trick, and hack it to accomplish that job. They didn't design it to be clean and beautiful so that non-technical people could understand and use it; it's their own tool, used to get the job done and then thrown away without further care, with more and more tricks and extensions added to adapt it to new things but never any care for consistency... and the same goes for C++. It's too complex for my stupid head. I'm too stupid.
 

drhowarddrfine

Son of Beastie

Thanks: 792
Messages: 2,575

#2
C is a simple language designed to do whatever you want it to do. It does not confine you to anything and you can define anything you want. It is as close to the metal as you can get without using assembly language or machine code. It will not hold your hand or protect you from anything. If you want to shoot yourself in the foot, it will let you with great speed.

Your complaint is sort of the same as complaining about the alphabet. It's just a bunch of letters with no defined words. But you can make your own words and definitions.

If you want or need anything above all that, you either create it yourself or use a different programming language.
 

JAW

Member

Thanks: 13
Messages: 29

#3
C is pretty archaic, it's several decades old after all, but it does have a purpose. It's all about using the right language for the job. If you want to build an enterprise app or website, look at a managed language (Java or C#); if you want to do some data analysis, take a look at R; for simple cross-platform scripting, Python. C has its place among operating systems, device drivers, embedded systems, game engines, etc., where you need that lower-level access and efficiency.
 

drhowarddrfine

Son of Beastie

Thanks: 792
Messages: 2,575

#4
C has its place among operating systems, device drivers, embedded systems, game engines, etc., where you need that lower-level access and efficiency.
So C is not so "archaic" after all.
 
G

giahung1997

Guest


#5
C is a simple language designed to do whatever you want it to do. It does not confine you to anything and you can define anything you want. It is as close to the metal as you can get without using assembly language or machine code. It will not hold your hand or protect you from anything. If you want to shoot yourself in the foot, it will let you with great speed.

Your complaint is sort of the same as complaining about the alphabet. It's just a bunch of letters with no defined words. But you can make your own words and definitions.

If you want or need anything above all that, you either create it yourself or use a different programming language.
It's not simple. It's a mess; I feel lost in a forest. There is only one alphabet, and it really is simple. Your C is neither one nor simple.

Nothing you said conflicts with me: I said C is a language for engineers, free to hack, to patch, to use tricks to get what they want. It's never simple. It's never consistent. It's never for people who don't have a CS background like me. So I know I'm stupid; I quit.

Engineers' tools are always a bunch of tricks and hacks, like your Vim; simple-minded people like me prefer nano or ee.

C is pretty archaic, it's several decades old after all, but it does have a purpose. It's all about using the right language for the job. If you want to build an enterprise app or website, look at a managed language (Java or C#); if you want to do some data analysis, take a look at R; for simple cross-platform scripting, Python. C has its place among operating systems, device drivers, embedded systems, game engines, etc., where you need that lower-level access and efficiency.
I want to do something with manual memory management. I'd use Pascal, since from what I've searched so far, whatever C can do, it can too. It's more consistent and really simple, even though your engineers always say it's a toy or requires too much typing. I'm not an engineer, so I'll use that toy; why so serious?

So C is not so "archaic" after all.
No way; C is old, accept it. It's still used not because it's good or modern. Like JavaScript: do you think it's awesome? It's used because it's the only way to go, browsers speak JS. The same goes for your C.

I don't want to start a flame war and get my account banned. Marked as solved.
 

leebrown66

Well-Known Member

Thanks: 102
Messages: 298

#6
If you ever feel like giving it another go, I highly recommend "The C Programming Language" by Brian W. Kernighan & Dennis M. Ritchie AS AN INTRODUCTION. Written by the original authors of the language, this book walks you through quite simply. It's what all programmers used to learn C from, because it was the only book available 'back in the day'. Heh, I'm showing my age now.

It will teach you the foundation upon which all those extensions are based.
 

ShelLuser

Son of Beastie

Thanks: 1,185
Best answers: 5
Messages: 2,552

#7
Everything is difficult or seems chaotic until you've fully grasped the way it works.

I see your "C complaints" and can place them almost literally next to complaints I've seen people make about Java (which happens to be one of my favorite languages). Most notably about the way import statements work, or how certain methods behave in a completely different way than others, stuff like that. "No consistency," some people say, while in fact there's plenty.

The main thing is basically to fully understand the system (the motivation behind it), as well as to keep the larger view in mind. If you focus on the lack of a boolean type then sure, that could seem dumb. But if you look at the larger perspective, you'll see that because it doesn't have one, it basically allows you to include and/or use any type you want merely by including it. This also means it allows for types that haven't even been invented yet.

So instead of focusing on what the language can do, you'd have to focus on what you can make it do.

Of course it also helps to try and focus on the good parts instead of the bad :) I can understand the frustration, but as I said: where you see limitations ("no out-of-the-box support") I only see opportunities ("allows for supporting anything you want").
 

CraigHB

Member

Thanks: 25
Messages: 95

#8
I've done a bit of programming in my life, first with Fortran in college during the mainframe days, and then Pascal and Basic on PC after going back to college later. I've done some assembly with embedded most recently. I simply can't get my head around C. Every time I learn how to do something with it, I forget a week later. It's like learning Latin. Anything else I can jump back into with a few references here and there. C gives me a big headache. Too bad for me it's so ubiquitous.
 

drhowarddrfine

Son of Beastie

Thanks: 792
Messages: 2,575

#9
Computers are old. Windows is nearly 40 years old. "Archaic" implies out-of-date, and nothing is further from the truth. That is my complaint when someone says it's archaic, yet it's everywhere and runs nearly everything, or parts of it.

When one struggles with C, I usually find they don't have an understanding of how a computer works. At least not a deep understanding of the communication between the processor, memory, IO, etc. I started life as an electronic engineer who designed and built computers from scratch using TTL logic, starting with bit-slice processors and the 8085A. So when I program, I find myself visualizing the flow of data from the CPU through the data lines to memory and other IO and back again. All this is second nature to me. C makes sense.

What I struggle with, sometimes, is the abstraction of higher level languages. They can drive me crazy.
 

JAW

Member

Thanks: 13
Messages: 29

#10
Archaic does not mean it is no longer used: http://www.dictionary.com/browse/archaic

Indeed, I write in C most days; I never made the move to C++ and instead skipped over to Java and C# for my OO (and employment) needs. I agree that you need to have a good understanding of how a computer works when working with C; when I write stuff in higher-level languages I often think about how it would work if I were to implement it in C instead. :)

Edit: Sorry, can't get the HTML anchor to work!
 

trev

Aspiring Daemon

Thanks: 100
Messages: 587

#11
[C is] never for people who don't have a CS background like me.
I taught myself C back in the days of FidoNet BBS systems and PC-DOS, and my background was law :) I don't think you need a CS background, but what you do need is a reason/motivation to learn C. For me, it was to be able to write various utilities for my BBS that did not otherwise exist.

I avoided Windows programming for years until I needed to write a tax calculator for Windows for my legal writing job (conversion of paper product into CD-ROM product). I looked at various options and ultimately chose Delphi v1.0 (oo Pascal) for its simplicity for the job and taught myself that.

Later I wrote an editorial publishing system for the publisher I worked for on a VAX minicomputer in DEC Command Language (DCL command line scripts) to make my life as a writer easier.

A couple of years later the company moved to Sun OS on Sun hardware and I rewrote the publishing system as a web application using the NCSA web server for the front-end GUI while the back-end was almost exclusively written in Bourne shell scripts with a bit of C and Perl for interfacing to the typesetting system.

To conclude, I hope I've demonstrated that: (1) You need a reason to learn a language; (2) You need to pick the right language for the job.
 

ralphbsz

Daemon

Thanks: 609
Best answers: 3
Messages: 1,058

#14
So many Cs... C89, C99, C11...
Pick one. You can pick the most recent one that is supported on all the machines you are intending to use; or you can go with a classic old one (like the ANSI C dialect of the K&R book), which is also supported on all machines.

Also, I wouldn't learn C today; I would learn C++ instead. It is very nearly a superset of C (so everything in C books will work in a C++ compiler), and from there you can progress to object-oriented programming.

This language doesn't even have a built-in boolean type; I have to include <stdbool.h> (thanks, Stack Overflow).
If you want a boolean type in C, you are not really programming C. You are programming some other language that you have a mental model of, and trying to express those thoughts in C. It would be like trying to speak French, using French grammar and sentence structure, but with English words: the result is unlikely to be satisfying. In C, you can just use the "int" type for true/false values.

This language doesn't have fixed-width data types either; int could be 4 bytes or 8 bytes depending on the system, ...
So? If you want to use a signed integer type, use "int"; if you want an unsigned integer type, use "unsigned int", and if you are interested in efficiency (*) but willing to have a smaller range of values, use "short" (perhaps unsigned). (* Warning: on modern RISC-style hardware, which includes Intel/AMD chips with modern compilers, short is not always more efficient than int). Using "int" gives you the natural and efficient word size of your machine.

Why exactly do you want fixed-size types? The only reason is if you are doing something system-dependent (like interfacing to hardware, or to protocols that come from other systems), and then you will need to use headers.

This language is just ... a mess.
Yes, it is a mess. And C++ is an even worse mess. That's because at their core, C and C++ are very old languages (I'm old enough to have met Ritchie, Kernighan and Thompson, and they are indeed very old; one of them has passed away). In the early days of C, the language was sufficient for system programming (as a matter of fact, early Unix versions were written in pretty straight C with few extensions). But as modern features have been added, the languages have become much more complex; and for compatibility reasons, few of the old features could be removed. The one big transition that was a great cleanup (and intentionally incompatible) was the transition from K&R C to ANSI C (when function prototypes came in); since then, the language has just gotten messier. It's even worse for C++, which has carried a huge amount of baggage since the mid-1980s. This is the reason I think C doesn't make a good teaching language (students have to dig through arcane, pointless details), and its use as an implementation language is also on the decline.

plus the GNU extensions...
With a modern C, most of the GNU extensions have become less necessary, as their functionality has been absorbed into the language. You can just ignore them; it's perfectly possible to write functioning code without them.

... this is a hacking language, the language of engineers, ...
Correct.

... they didn't design it to be clean, ...
It is possible to write really clean and clear software in C. But it is a lot harder to do it than in other languages, because C will always be verbose. For example, you haven't even talked about string processing in C, and how it requires constantly allocating and freeing memory manually, which makes it really error-prone. But if done carefully, it can be done.

Also, the early C (like the one described in the second edition of the K&R book) is much simpler and cleaner than modern ones.

I would suggest that if you want to learn programming, you start with a different language. I'm particularly fond of Python these days, but there are many other options.
 
G

giahung1997

Guest


#15
Pick one. You can pick the most recent one that is supported on all the machines you are intending to use; or you can go with a classic old one (like the ANSI C dialect of the K&R book), which is also supported on all machines.

Also, I wouldn't learn C today; I would learn C++ instead. It is very nearly a superset of C (so everything in C books will work in a C++ compiler), and from there you can progress to object-oriented programming.


If you want a boolean type in C, you are not really programming C. You are programming some other language that you have a mental model of, and trying to express those thoughts in C. It would be like trying to speak French, using French grammar and sentence structure, but with English words: the result is unlikely to be satisfying. In C, you can just use the "int" type for true/false values.


So? If you want to use a signed integer type, use "int"; if you want an unsigned integer type, use "unsigned int", and if you are interested in efficiency (*) but willing to have a smaller range of values, use "short" (perhaps unsigned). (* Warning: on modern RISC-style hardware, which includes Intel/AMD chips with modern compilers, short is not always more efficient than int). Using "int" gives you the natural and efficient word size of your machine.

Why exactly do you want fixed-size types? The only reason is if you are doing something system-dependent (like interfacing to hardware, or to protocols that come from other systems), and then you will need to use headers.


Yes, it is a mess. And C++ is an even worse mess. That's because at their core, C and C++ are very old languages (I'm old enough to have met Ritchie, Kernighan and Thompson, and they are indeed very old; one of them has passed away). In the early days of C, the language was sufficient for system programming (as a matter of fact, early Unix versions were written in pretty straight C with few extensions). But as modern features have been added, the languages have become much more complex; and for compatibility reasons, few of the old features could be removed. The one big transition that was a great cleanup (and intentionally incompatible) was the transition from K&R C to ANSI C (when function prototypes came in); since then, the language has just gotten messier. It's even worse for C++, which has carried a huge amount of baggage since the mid-1980s. This is the reason I think C doesn't make a good teaching language (students have to dig through arcane, pointless details), and its use as an implementation language is also on the decline.


With a modern C, most of the GNU extensions have become less necessary, as their functionality has been absorbed into the language. You can just ignore them; it's perfectly possible to write functioning code without them.


Correct.


It is possible to write really clean and clear software in C. But it is a lot harder to do it than in other languages, because C will always be verbose. For example, you haven't even talked about string processing in C, and how it requires constantly allocating and freeing memory manually, which makes it really error-prone. But if done carefully, it can be done.

Also, the early C (like the one described in the second edition of the K&R book) is much simpler and cleaner than modern ones.

I would suggest that if you want to learn programming, you start with a different language. I'm particularly fond of Python these days, but there are many other options.
Something is wrong with your post; I see bgcolor errors everywhere, making it very difficult to read. Perhaps it's the forum's fault? Your post is long but useful; I tried and read it all.

I come from Turbo Pascal. Vietnam uses this language to teach students programming in school, and it has a built-in bool, so I used to think every language in the world must have this data type.

I want fixed-width ints because I don't want to see unexpected errors (silly errors; if you program on a limited embedded device, you use the manufacturer's compiler and a subset of C). I want my app to run on PC only, so why do I have to suffer this craziness? I always include <inttypes.h> and use uint64_t instead of unsigned long.

The C standard is a mess. Books and tutorials written for other versions of it cannot be used with the latest.

Example: in the old days this was correct: scanf("%lu", &a); now you must include <inttypes.h> and use this scrap: scanf("%" SCNu64, &a); which is a dirty hack with the SCN*/PRI* macros predefined in the header. SCNu64 isn't even inside the quotes; this ugly syntax makes me ill.

About Python: maybe my head is crazy, but I find this language difficult, really, I'm not joking. I can never get it.
 
G

giahung1997

Guest


#16
I've done a bit of programming in my life, first with Fortran in college during the mainframe days, and then Pascal and Basic on PC after going back to college later. I've done some assembly with embedded most recently. I simply can't get my head around C. Every time I learn how to do something with it, I forget a week later. It's like learning Latin. Anything else I can jump back into with a few references here and there. C gives me a big headache. Too bad for me it's so ubiquitous.
Me too. Other people complain a lot about C because of pointers; I can use those easily, but the C data types I forget right away after a long sleep :)
 

ralphbsz

Daemon

Thanks: 609
Best answers: 3
Messages: 1,058

#17
Something is wrong with your post; I see bgcolor errors everywhere,
Known bug in the forum software. Admin is working on it. In the meantime, I edited my post to remove them. Sorry about that.

I come from Turbo Pascal. Vietnam uses this language to teach students programming in school, and it has a built-in bool, so I used to think every language in the world must have this data type.
When in Rome, do like the Romans. When using C, write idiomatic C, meaning in the usual style that C is used for. That does not need a "bool" type. When writing Fortran, write idiomatic Fortran, same with COBOL/Ruby/...

I want fixed-width ints because I don't want to see unexpected errors (silly errors; if you program on a limited embedded device, you use the manufacturer's compiler and a subset of C). I want my app to run on PC only, so why do I have to suffer this craziness? I always include <inttypes.h> and use uint64_t instead of unsigned long.
Do you seriously expect to ever count to 64 bit numbers? Probably not, as it would take a long time.

When programming, you need to know the natural limits of your numbers. For example, if an integer is a counter of real-world things (like in inventory control: how many widgets are in the box of widgets), you'll know the real-world maximum. Then you need to actually learn C and know the rules: short and int are guaranteed to be at least 16 bits, long at least 32 bits. Then use the appropriate type for your data. The compiler will make it efficient. And write code to enforce the limits: if you know the counter of widgets is only supposed to go to 16 bits, then there should be an if statement in your code that prints an error message if a human tries to create the 32768th or 65536th widget. Mindlessly using 64-bit types everywhere is wasteful, and demonstrates that you don't understand your problem domain.

The only time you use inttypes.h and types like uint64_t is when the hardware or some data interchange forces you into that data type. Otherwise, the idiomatic thing to do is to use the native types (if you really need 64 bits, which does happen occasionally, use the type "long long").

EX: for old day this thing is correct: scanf("%ul", &a); now you must include <inttypes.h> and use this scrap: scanf("%" PRIu64, &a); which is a dirty hack with macro PRI* predefined in the header, PRIu64 even not in the "", this ugly syntax make me ill.
That's only because you insist on using the uint64_t type. If your number were an int, then "%d" would work, and if it were an unsigned int, then "%u" would work.

About Python: maybe my head is crazy, but I find this language difficult, really, I'm not joking. I can never get it.
There are many tastes in programming languages. For example, I really don't like Perl at all; the syntax and the usual idioms make my skin itch. I happen to like Python. A friend tried to convince me to program in Ruby, and I did that for about 3 months; I could probably start liking it after a while, but Python still feels more natural to me. I actually don't like C++ and C (much prefer Java if you need a non-scripted OO language), but because of the reality of the job market, I end up doing a lot of programming in C++. The difference is: I get paid for this stuff, so I program in whatever language is needed at the time.

If you don't like Python, then don't learn it. Try learning another scripting language for fun.
 
G

giahung1997

Guest


#18
Known bug in the forum software. Admin is working on it. In the meantime, I edited my post to remove them. Sorry about that.


When in Rome, do like the Romans. When using C, write idiomatic C, meaning in the usual style that C is used for. That does not need a "bool" type. When writing Fortran, write idiomatic Fortran, same with COBOL/Ruby/...


Do you seriously expect to ever count to 64 bit numbers? Probably not, as it would take a long time.

When programming, you need to know the natural limits of your numbers. For example, if an integer is a counter of real-world things (like in inventory control: how many widgets are in the box of widgets), you'll know the real-world maximum. Then you need to actually learn C and know the rules: short and int are guaranteed to be at least 16 bits, long at least 32 bits. Then use the appropriate type for your data. The compiler will make it efficient. And write code to enforce the limits: if you know the counter of widgets is only supposed to go to 16 bits, then there should be an if statement in your code that prints an error message if a human tries to create the 32768th or 65536th widget. Mindlessly using 64-bit types everywhere is wasteful, and demonstrates that you don't understand your problem domain.

The only time you use inttypes.h and types like uint64_t is when the hardware or some data interchange forces you into that data type. Otherwise, the idiomatic thing to do is to use the native types (if you really need 64 bits, which does happen occasionally, use the type "long long").


That's only because you insist on using the uint64_t type. If your number were an int, then "%d" would work, and if it were an unsigned int, then "%u" would work.


There are many tastes in programming languages. For example, I really don't like Perl at all; the syntax and the usual idioms make my skin itch. I happen to like Python. A friend tried to convince me to program in Ruby, and I did that for about 3 months; I could probably start liking it after a while, but Python still feels more natural to me. I actually don't like C++ and C (much prefer Java if you need a non-scripted OO language), but because of the reality of the job market, I end up doing a lot of programming in C++. The difference is: I get paid for this stuff, so I program in whatever language is needed at the time.

If you don't like Python, then don't learn it. Try learning another scripting language for fun.
No, I don't use 64-bit ints as counters. For counters, I use uint8_t or uint16_t. Because I know it's very easy to ruin everything in C, I use very strict rules for my apps, like what I used with Pascal. For example, if I want to scan an integer > 0 I will use uint64_t, not int or long, because those allow negative values. Everything I place in a do-while loop (a stupid trick, but I think exception handling in C or even C++ sucks). I follow this discipline. Writing free-form C that only has to run and throw out a result is easy, though.

P.S. I grew up "strongly typed". My friends in IT quickly jump on every trend: everything JS, dynamic, functional, etc... but I consider my brain old and I don't want to change :)
 

ralphbsz

Daemon

Thanks: 609
Best answers: 3
Messages: 1,058

#20
For example, if I want to scan an integer > 0 I will use uint64_t, not int or long, because those allow negative values.
Use "unsigned" instead. (Actually, the type is really "unsigned int", but people usually just type "unsigned"; the "int" part is added automatically by the compiler.) That way you get the useful restriction that the number can never be negative, while still using native data types and being able to use simple printf and scanf tags, namely "%u".

One of my personal rules is to always use unsigned (instead of int) for loop indices and counters, because they can't go negative. I have colleagues who disagree with me, and say that int is the most natural data type, and should be used. Both sides have arguments, and this becomes a question of taste.
 