Swift, C & Assembly Language

I'm looking to get into programming. Since I use macOS, logically, I should start with Swift.
Swift is a new high-level language, much like Rust and others. But the base of the whole OS is written in C, and I was told that C is the best way to make a program fast, which is why it's used for writing drivers. If you need something even faster, some suggested assembly is the fastest, which is why it's used in embedded systems.

I'm wondering which is the better choice... and if not Swift, then which of the other two? Is it worth learning assembly?
 
On a Mac, there's Objective-C, which is (in a way) C with Mac add-ons. C is high-level assembly in a sense and, with today's compilers, can be very efficient at generating assembly. The only time I would write assembly nowadays is when a specific routine needs optimization. Today's microprocessors are far too complicated to take advantage of all the speed-up techniques by writing assembly yourself.

This is coming from a guy who had to be dragged kicking and screaming away from his assembler in 1987 to learn this new C thing, and who used to write microcode for bit-slice machines using toggle switches.
 
Beginners should concentrate on the higher-level application frameworks first. Also, learning the syntax of a language is only half of it. To fully appreciate the nuances of programming, you want to develop an understanding of comp-sci principles like data structures, design patterns, etc. Getting under the hood should be deferred until you have some application-level experience... IMHO
 
 
Awesome. I started in BASIC-PLUS and LSI-11 assembler in 1982. I cringe when I have to look at or modify C code these days. Gimme C++14 or death! You probably also remember real computers... the ones with bit-switch panels on the front of them.
 
tempest766: I started life as an electronics engineer. The first computers I designed were built with TTL logic, and the most integrated ones used the 74181 ALU.

Did some TTL-level interfacing of early PICs in the physics lab in college. Had to interface the Geiger-Müller tubes to, cough cough, C64 machines to track radioactive decay for isotope-percentage experiments. To this day I still end up fighting with the occasional software guy who thinks TTL logic is "compatible" with RS-232 signalling just because "some" serial cards would trigger a mark on a TTL drop to ground. Bad engineer! No cookie!
 
Most programmers tell me that learning assembly language isn't worth the time these days, as extremely fast code can be had using C and a good C compiler.
In terms of object-oriented programming, C++ is the king. But on the Mac, it's Objective-C. They tell me that if I want the same level as C++, I need to look at Swift.
Actually, the way Apple talks about Swift, it's like they're hailing it as the new C. I highly doubt that, though, or they would have completely rewritten their OS in Swift.
 
Languages and tools that are not platform-agnostic really rub me the wrong way. Part of the vendor business model in the 21st century is to coerce developers into using a special tool or language to program the vendor's hardware. That's kind of why I've stayed true to my *NIX roots over the years. Application-level code written in C or C++ can generally be ported to ANY *NIX-type OS with minimal effort. I can proudly say that I never learned Swift, Cocoa, C#/.NET, Android, etc.
 
Don't throw away assembly too soon. You need to know the assembler and read what your compiler does for you, to get a feeling for the process. I am currently being paid for ripping out the design-pattern layer cake in an embedded system that consists of too many layers of signalling and far too many templates. The authors had a good idea of the design patterns, but a poor understanding of what each line/statement would actually do. The result is a complete disaster, with its own ProblemFactory().
 
From what I've read in various places, Apple has made Swift open source. This essentially means Swift as a programming language isn't just for macOS alone, although, it being created by Apple, most current Swift developers will be ones focusing on the Mac.
 
I can very much appreciate what you're getting at... I find myself in similar situations sometimes. I think the issue here is that the OP is looking to "get into programming", thus at a level so far removed from embedded systems that it's apples and oranges. Admittedly, I try to use C++ constructs in embedded systems when possible: not mission-critical, not concerned about static vs. free-store allocation and heap fragmentation, etc., because it won't be running 24x7 for months on end. Without seeing the specific systems you are tackling I'm only guessing, but I am curious about your "issues" with the templates. Obviously the original authors did not do any requisite simulation of their design on a workstation before running it on the target hardware?
 
Oh, they did... and it worked on their workstation. And then some new guys (including yours truly) came in and started asking questions about overhead, the role of JSON in embedded systems, and the design as a whole. They also started to torment the code base with things like lint, the clang analyzer, and valgrind. Oh, and -Wall -Werror. And when you have to explain to some "senior architect" how to use valgrind, and that a virtual call only needs these three instructions... and how much bloat their template overuse generated, you start to wonder what was missing in his experience/training. What is missing is the feeling for what the machine is actually doing with your code.
 
I am not a programmer, but I am willing to learn as a hobby (I'm just lacking some spare time), and I usually take my hobbies very seriously, even more than I should.

I asked basically the same question as you on IRC some time ago and got two 'best' routes from some developers:

  • Start with C, building a solid base and learning how computers actually work.
HERE, Book, Book, Book, Book.

  • Start with some interpreted high-level programming language (Perl, Python) to get the basics and start doing stuff, then work your way down.
Other interesting material: HERE, HERE.

Another piece of advice was: never start with the brainfuck that is C++. Also, I was told by many that lang/rust can be very frustrating to learn programming with, but I do not remember why.

Another language I would consider, not necessarily as the first to learn but the second, would be Ada.

EDIT: I am willing to first properly learn the Bourne shell, and later (or more or less at the same time) learn C.

EDIT.2: a free BOOKLET from AdaCore.

EDIT.3: Some Ada MATERIAL.

Hope this helps. :)
 
I am currently toying around with Lazarus/Free Pascal. The compiler builds itself in about 3 seconds and comes with a big class library. The generated code is pretty good, and it keeps the old Turbo Pascal feeling. Maybe something to look at.
 