Admin
And a String is generally just a wrapper of a char array. I don't see the distinction between wrapping the character-array operations in a class and writing them over and over again. At a hardware level, a char array is a series of bytes. At a hardware level, a String is a series of bytes. The hardware doesn't give a rat's ass what that series of bytes represents in your program.
I fail to see why that means people should not use Strings and avoid char arrays in C++.
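The "wrapper" point can be sketched concretely. Here's a minimal, hypothetical class (the name MiniString is invented for illustration; the real std::string does far more) showing that a string class is just the familiar strlen/strcpy/strcat operations on a char array moved behind an interface:

```cpp
#include <cstring>

// A toy string class: nothing inside but char-array operations.
class MiniString {
public:
    MiniString(const char* s) {
        len_ = std::strlen(s);
        buf_ = new char[len_ + 1];
        std::strcpy(buf_, s);
    }
    ~MiniString() { delete[] buf_; }
    MiniString(const MiniString& other) {
        len_ = other.len_;
        buf_ = new char[len_ + 1];
        std::strcpy(buf_, other.buf_);
    }
    MiniString& operator=(const MiniString& other) {
        if (this != &other) {
            delete[] buf_;
            len_ = other.len_;
            buf_ = new char[len_ + 1];
            std::strcpy(buf_, other.buf_);
        }
        return *this;
    }
    // Concatenation is just strcpy + strcat into a freshly sized array.
    MiniString operator+(const MiniString& other) const {
        char* tmp = new char[len_ + other.len_ + 1];
        std::strcpy(tmp, buf_);
        std::strcat(tmp, other.buf_);
        MiniString result(tmp);
        delete[] tmp;
        return result;
    }
    std::size_t size() const { return len_; }
    const char* c_str() const { return buf_; }
private:
    char* buf_;
    std::size_t len_;
};
```

Whether you call `a + b` or write the strcpy/strcat pair by hand, the hardware executes the same byte copies either way.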
Admin
It doesn't, and Otto didn't say it did.
You must admit there is value in learning to manipulate character arrays, even if the value is limited to "someday you might need to code a lower level program w/o C++"
If you're prepared to put a design constraint on yourself to the effect that "I will never need to use character arrays", by all means stick to the string library. There's a good argument for this... the string library is easier, and you gain time to learn other things.
Admin
You're right, he wrote:
This is what I am questioning; I don't know how we got into this whole "you might need C one day" discussion. Hardware doesn't need to "directly implement String constructs" in order for a language to allow them. What hardware does have a direct implementation for Strings? This implies that you might not be able to use Strings when the target platform's hardware doesn't support them. It also implies that common hardware does have direct support for character arrays. I'm not saying that hardware could not do this, but it's clearly not a requirement in order to use either construct.
Admin
I never said there wasn't value to it. As far as I'm concerned, arrays is arrays. Bytes is bytes. I solve problems as needed.
Admin
the kind of hardware that has native support for times new roman, courier, lucida console, comic sans, and helvetica.
Admin
I think Otto just momentarily confused hardware with libraries. On the microcontroller I've worked with (Atmel ATMega16), you don't get the C standard library. And you don't have the tools (MMU, or in some cases, stack space) to build a String abstraction. So you have to work with character arrays directly.
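On a freestanding target like that, with no standard library, you end up writing the byte-level routines yourself. A couple of hypothetical hand-rolled equivalents (names invented; any real firmware would have its own):

```cpp
// strlen equivalent: walk the array until the terminating '\0'.
unsigned my_strlen(const char* s) {
    unsigned n = 0;
    while (s[n] != '\0') ++n;
    return n;
}

// strcpy equivalent: copy bytes, including the terminator.
void my_strcpy(char* dst, const char* src) {
    while ((*dst++ = *src++) != '\0') { }
}
```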
Admin
OK, I'll take your word for it. But, from my perspective, a basic String abstraction can be a set of subroutines that work on character arrays. Are you saying you can't define subroutines? The memory requirements for this type of abstraction are exactly the same as for a character array.
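To illustrate the claim above: such an "abstraction" can operate entirely in place on a caller-supplied array, costing no memory beyond the array itself. A hypothetical sketch (the name str_append is invented for illustration):

```cpp
// Append src to the NUL-terminated string in buf, never exceeding cap
// bytes total. Returns true on success, false if it would not fit.
// Uses no heap and no storage beyond the caller's own char array.
bool str_append(char* buf, unsigned cap, const char* src) {
    unsigned len = 0;
    while (buf[len] != '\0') ++len;        // find current end
    unsigned add = 0;
    while (src[add] != '\0') ++add;        // length to append
    if (len + add + 1 > cap) return false; // would overflow the array
    for (unsigned i = 0; i <= add; ++i)    // copy, including the '\0'
        buf[len + i] = src[i];
    return true;
}
```

A set of such subroutines gives you length-checked string handling with exactly the footprint of the bare char array.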
Admin
OK, but I see absolutely no need for the innermost tests -- if (t!=s && *t != *s) -- if it were me, I'd do the copy unconditionally. The tests waste cycles figuring out whether doing the copy would be the equivalent of doing nothing.
I'd do it this way:
void RemoveColonsFromString(char *s)
{
    char *t;
    char c;

    t = s;
    do {
        c = *s++;
        if (c != ':')
        {
            *t++ = c;
        }
    } while (c != '\0');
}
To me, that's the simplest, clearest way to do it.
Admin
BTW I just now thought of an answer to this. A proper string implementation should be able to deal with the full range of Unicode characters, which means the char type is useless. You need 2 or even 4 bytes to represent one character, so you'll define a type for that and work with arrays of that type.
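In modern C++ that wider element exists as char32_t (one code point per element, i.e. UTF-32), so the "array of a wider type" above is literal. A small hedged sketch:

```cpp
// A char is one byte, but a Unicode code point can need up to 4 bytes.
// With char32_t, each array element holds one full code point, so
// counting characters is just walking the array -- the same loop you'd
// write for a char array, over a wider element type.
unsigned count_code_points(const char32_t* s) {
    unsigned n = 0;
    while (s[n] != U'\0') ++n;
    return n;
}
```

With plain char and a variable-width encoding like UTF-8, the same count would require decoding multi-byte sequences, which is exactly why the char type alone falls short.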
Admin
Some microcontrollers have no stack at all. And some have such a small stack space that you're limited to only going 3 or 4 subroutines deep. (Although you might be able to hack something together with a preprocessor - I haven't worked with such a chip myself.)
But the point is, the subroutines you're creating aren't usually general enough for me to consider them an abstraction in the same way as the C standard library or a String class.
Admin
I agree, though you both seem to be a bit too focused on C/C++. I got my degree (in 2002) without ever formally learning C and without writing more than a hundred lines of it. It was Java right from the beginning and for most classes teaching abstract concepts. However, the mandatory classes also included assembler and even microcode.
While I've never actually used either outside those classes, don't expect to do so in the future, and am glad about that, I'm certain that these classes were very important because they (in combination with a few other "useless" ones) gave me a complete (if non-detailed) understanding of how a computer fundamentally works on all levels, from the logic gates on the CPU IC up to the user applications.
I actually think this has a measurable value for my work, even though that is situated pretty much exclusively on the highest of those levels (distributed Java apps), because it made me view the computer as a fundamentally deterministic and understandable (if complex) machine and gave me the confidence to know that I can always and completely control that machine if I put my mind to it. There is nothing magic or scary about it. Any bug can be found and fixed if you dig deep enough. Any piece of code must have a reason to exist (even though that reason may be to satisfy stupid quirks of some routine written by a programmer who does believe in magic).
Admin
I feel like you are assuming that I have never programmed a microcontroller. That is not the case. I have worked on many different low-level technologies, from building circuits with loose transistors to writing assembly interpreters in machine code.
I can't recall exactly, but I seem to remember building subroutines with no hardware stack. Anyway, I can imagine how I would do it if were necessary.
My point is that the code you write whether it be in C, C++ or Java, all compiles down to the same types of instructions. Saying my hardware doesn't support Strings doesn't make sense.
Admin
I love it. It's the perfect marketing drivel litmus test. Whenever it is used as a verb, you know you're reading/hearing content-free strings of words that can be safely ignored.
Admin
I don't think Java is a bad language to learn to program in. I actually think it's a good language because it forces a level of structure on the programmer and frees that programmer from a lot of confusing and arcane syntax.
But a CS degree is more than just a programming degree. I think they need to create a more business/computer mixed curriculum for people who aren't very interested in the fundamentals. I think part of the problem is that this would re-marginalize CS departments and the heads don't want that.
Admin
I think leverage is actually a pretty good verb. I don't know of a better word for it. It's probably overused but don't throw the baby out with the bathwater. Verbing is a fact of life these days. Best to just agreement it.
Admin
Either you're a true master of obfuscation, or you've neglected to actually check for the ':' anywhere in there.
Admin
I assure you that this correctly removes colons from MAC addresses. (Barring minor bugs, as I have not actually tested it.)
Thank you. I don't know that I'm a master, but this is a good start on the path.
Note the function is RemoveColonsFromMac? That should be your clue. My version will not work for anything other than well-formed MAC addresses (such as the original MacAddrToChar would produce).
I'll give it away tomorrow, but I'll give you overnight to sleep on the trick I'm pulling.
Admin
Heh. Yes, I see how it works. :)
Admin
I've used Context Free Grammars but not Content Free [:'(]
Admin
An OO language with no public or private identifiers? What sick language is this?
Admin
Smalltalk? Python? Javascript?
Admin
Spoken like someone who learned to program from "Teach yourself VB in 21 days"