
C++ proposition


Jief_Machak

823 posts in this topic


I think it is more likely that the file, and therefore the string, is not encoded the way it should be, or that you've done something else, since the evidence against your case is that it has worked in every previous version with no problem. Another thing is that you keep using L"" when you should be using u"" to ensure that you produce a UTF-16 string literal. Did you check the object file in a hex editor to see if the string is correctly encoded? I have a bunch of Arduinos, yes... most don't have much memory beyond that, since they have a 14-bit program counter, which is 16 KB...
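For reference, a minimal standalone sketch (plain C++11, nothing Clover-specific) of the difference at stake: u"" always yields char16_t UTF-16 code units, while the width of L"" depends on the toolchain (2 bytes per unit on MSVC, usually 4 with GCC/Clang on Linux and macOS).

#include <cstdio>

int main() {
  const char16_t *utf16 = u"Кловер";  // guaranteed UTF-16 code units (C++11)
  const wchar_t  *wide  = L"Кловер";  // 2 bytes per unit on MSVC, usually 4 elsewhere

  std::printf("sizeof(char16_t) = %zu\n", sizeof(char16_t)); // always 2
  std::printf("sizeof(wchar_t)  = %zu\n", sizeof(wchar_t));  // toolchain-dependent
  (void)utf16;
  (void)wide;
  return 0;
}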


@Jief_Machak Well, the answer is within your own logs: why does it actually print fine with "%a", which is supposed to be (E)ASCII, stored in a string with supposedly 8 bits per character? Cyrillic characters are not in (E)ASCII, and in no Unicode encoding I know of do they fit within 8 bits... this looks badly borked, sorry.

 

EDIT: Hmm, nvm, it looks like UTF-8 is expected rather than (E)ASCII, so I guess that part is fine.

Edited by Download-Fritz

Yeah, I'm not sure it makes a difference what the encoding for CHAR8 is. He is looking at a log whose representation he also produced on his own machine, so whatever he wrote stayed in the same encoding because the string was simply copied through. (E)ASCII and UTF-8 overlap in the ASCII range, and both use byte values 1-255 with a NUL terminator, so unless you look at how the string is actually represented in the object file, there's no way to tell. Although I wonder what the log looks like on another OS. I know something that changed is incorrect though...
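If it helps, a minimal sketch (plain C++, nothing project-specific) of dumping the raw bytes of a literal to see which encoding the compiler actually emitted, without hunting through the object file; the expected byte values in the comments assume the source file itself is saved as UTF-8.

#include <cstdio>
#include <cstddef>

static void HexDump(const void *data, std::size_t size) {
  const unsigned char *p = static_cast<const unsigned char *>(data);
  for (std::size_t i = 0; i < size; ++i)
    std::printf("%02X ", p[i]);
  std::printf("\n");
}

int main() {
  const char     u8[]  = "Кловер";   // source file saved as UTF-8
  const char16_t u16[] = u"Кловер";
  HexDump(u8,  sizeof u8);           // expect D0 9A D0 BB ... (UTF-8)
  HexDump(u16, sizeof u16);          // expect 1A 04 3B 04 ... (UTF-16LE)
  return 0;
}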

 

EDIT: Or look at the log in a hex editor instead of the object file.

Edited by apianti

6 hours ago, apianti said:

 

 

EDIT: Or look at the log in a hex editor instead of the object file.

I did this. That's why I invented these lines:

#ifdef _MSC_VER
    Str8 = GetUnicodeChar(Str8, &letter);
    Str8++;
#else
    letter = string; // already UTF-16 in clang
#endif


1 hour ago, Slice said:

I did this. That's why I invented these lines:

#ifdef _MSC_VER
    Str8 = GetUnicodeChar(Str8, &letter);
    Str8++;
#else
    letter = string; // already UTF-16 in clang
#endif

 

That is converting a UTF-8 string to UTF-16; I was referring to UTF-8 to UTF-8, basically. Also, the problem is that L"" is not guaranteed to be any particular size in C++. Why are you incrementing Str8 after the call? GetUnicodeChar() should return the next character. What is string, and how is that being coerced into a character? I installed several languages in Ubuntu, and in clang L"" is UTF-32 (it was originally UTF-16 before that). CHAR16 should not have been changed to wchar_t; it should have stayed the same or been changed to char16_t. There's clearly an issue with encodings going on.
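For illustration only, a hypothetical sketch of a decoder with the same shape as the GetUnicodeChar() call above (this is not the project's actual implementation): it consumes one UTF-8 sequence, stores the code point truncated to UCS-2, and returns a pointer just past the consumed bytes, so an extra Str8++ afterwards would skip the first byte of the next character.

#include <cstdint>

static const char *DecodeUtf8(const char *s, char16_t *out) {
  unsigned char c = static_cast<unsigned char>(*s);
  std::uint32_t cp;
  int extra;

  if      (c < 0x80) { cp = c;        extra = 0; }  // 1-byte sequence (ASCII)
  else if (c < 0xE0) { cp = c & 0x1F; extra = 1; }  // 2-byte sequence (e.g. Cyrillic)
  else if (c < 0xF0) { cp = c & 0x0F; extra = 2; }  // 3-byte sequence
  else               { cp = c & 0x07; extra = 3; }  // 4-byte sequence

  ++s;
  while (extra-- > 0 && (*s & 0xC0) == 0x80) {
    cp = (cp << 6) | (static_cast<unsigned char>(*s) & 0x3F);
    ++s;
  }
  *out = static_cast<char16_t>(cp);  // truncated to UCS-2, like CHAR16
  return s;                          // already points at the next character
}

With that contract a caller simply loops with p = DecodeUtf8(p, &letter) until letter is 0, with no extra increment.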


We are talking about two different things here. One is the fact that DBG/MsgLog cannot print a string containing UCS-2 characters > 255; the other is the VS encoding.

For 1), I checked with an old version (before C++) and it doesn't work there either. The reason is that, internally, Ascii...Print just ignores the second byte of a UCS-2 char.

For 2), I can't believe there is no way to get a UTF-16 literal in VS. I also tried u"" and the encoded string is malformed.

 

11 hours ago, apianti said:

most don't have much memory beyond that, since they have a 14-bit program counter, which is 16 KB...

That's why I'm not using glibc printf...

 

11 hours ago, Download-Fritz said:

why does it actually print fine with "%a", which is supposed to be (E)ASCII

%a will just print all the bytes without analysing them. So if you have UTF-8 as input, you get UTF-8 as output. That was the point of UTF-8: because it's still stored in char, it works with old programs even if they don't know about it.
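A tiny sketch of that point (a hypothetical "%a"-style handler, not the actual AsciiPrint code): an 8-bit copy never decodes anything, so UTF-8 input comes out byte-for-byte identical and a UTF-8 terminal renders it correctly.

#include <cstdio>

static void PrintAscii(const char *s) {               // stand-in for a "%a" handler
  while (*s)
    std::putchar(static_cast<unsigned char>(*s++));   // copy bytes verbatim, no decoding
}

int main() {
  PrintAscii("Clover Кловер\n");  // UTF-8 in, the same UTF-8 bytes out
  return 0;
}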

 

11 hours ago, apianti said:

I know something that changed is incorrect though...

Nothing changed; it seems it was always like that. The thing is this: MsgLog("Test MsgLog ascii=%a ucs-2=%s\n", "a string", L"ascii char in ucs-2 string\n"); works fine (GCC and Clang), because the UCS-2 string contains only chars < 255, so the second byte can be ignored and the result is the same.

2 hours ago, apianti said:

been changed to char16_t

wchar_t and char16_t are equivalent when you compile with -fshort-wchar (GCC, Clang). They are still considered two distinct types, and L"" literals are wchar_t. To avoid casts during the refactoring, CHAR16 is defined as wchar_t. Yes, doing that is a transition.

On VS, wchar_t seems to always be 2 bytes. I didn't find a setting to make wchar_t 4 bytes for UTF-32.
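To make that concrete, a minimal standalone sketch (assuming GCC/Clang with -fshort-wchar, or MSVC, where wchar_t is 2 bytes): even when size and encoding match, wchar_t and char16_t remain distinct types, so a CHAR16 defined as wchar_t accepts L"" literals but not u"" ones without a cast.

#include <type_traits>

typedef wchar_t CHAR16;  // the transitional definition described above

static_assert(sizeof(CHAR16) == 2, "needs -fshort-wchar on GCC/Clang (always true on MSVC)");
static_assert(!std::is_same<wchar_t, char16_t>::value,
              "still distinct types even when size and encoding match");

const CHAR16   *a = L"Кловер";   // OK: L"" has type const wchar_t[N]
// const CHAR16 *b = u"Кловер";  // error: const char16_t[N] does not convert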


17 hours ago, Jief_Machak said:

We are talking about two different things here. One is the fact that DBG/MsgLog cannot print a string containing UCS-2 characters > 255; the other is the VS encoding.

For 1), I checked with an old version (before C++) and it doesn't work there either. The reason is that, internally, Ascii...Print just ignores the second byte of a UCS-2 char.

For 2), I can't believe there is no way to get a UTF-16 literal in VS. I also tried u"" and the encoded string is malformed.

 

It is converting to (E)ASCII, so why would it use any characters that aren't (E)ASCII? You probably have a type mismatch with VS.

 

18 hours ago, Jief_Machak said:

That's why I'm not using glibc printf...

 

I have never even come close to filling up the memory. Also, I almost never need to print on an Arduino unless I'm debugging, so there is no need to waste time redoing something that already exists and is probably better optimized.

 

18 hours ago, Jief_Machak said:

%a will just print all the bytes without analysing them. So if you have UTF-8 as input, you get UTF-8 as output. That was the point of UTF-8: because it's still stored in char, it works with old programs even if they don't know about it.

 

Yeah, that is what I was saying. The log is being printed in (E)ASCII though, so the conversion only covers the first 256 code points of UTF-16, because those match the same code page (ISO 8859-1, Latin-1).
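A minimal sketch of that truncation (illustration only, not the actual Ascii...Print code): keeping just the low byte of each UCS-2 unit is lossless for the first 256 code points, which coincide with ISO 8859-1, and garbles anything above that, such as Cyrillic.

#include <cstdio>

static void PrintUcs2LowBytes(const char16_t *s) {
  while (*s)
    std::printf("%02X ", static_cast<unsigned>(*s++) & 0xFF);  // drop the high byte
  std::printf("\n");
}

int main() {
  PrintUcs2LowBytes(u"é");  // U+00E9 -> E9, still 'é' in Latin-1
  PrintUcs2LowBytes(u"К");  // U+041A -> 1A, a control character (SUB)
  return 0;
}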

 

18 hours ago, Jief_Machak said:

Nothing changed; it seems it was always like that. The thing is this: MsgLog("Test MsgLog ascii=%a ucs-2=%s\n", "a string", L"ascii char in ucs-2 string\n"); works fine (GCC and Clang), because the UCS-2 string contains only chars < 255, so the second byte can be ignored and the result is the same.

wchar_t and char16_t are equivalent when you compile with -fshort-wchar (GCC, Clang). They are still considered two distinct types, and L"" literals are wchar_t. To avoid casts during the refactoring, CHAR16 is defined as wchar_t. Yes, doing that is a transition.

On VS, wchar_t seems to always be 2 bytes. I didn't find a setting to make wchar_t 4 bytes for UTF-32.

 

In VS, I'm pretty sure the character types are signed, L"" is UCS-2 and u"" is UTF-16, always, while "" is locale-dependent. I think you are confusing things: the function is converting to (E)ASCII, not UTF-8, so of course it is only going to contain (E)ASCII characters; if you change this, then some things that rely on (E)ASCII-only output may break. I was talking about Ubuntu for UTF-32. I am confused why you think the compiler does not make correct strings from u"": it must make a UTF-16 string in any C++ compiler, because that is what the standard says it is. So you should be using u"" and char16_t to properly ensure that it is UTF-16. Relying on a compiler-specific mechanism to enforce something is bad design, especially when there is a language mechanism that accomplishes the desired outcome, which I think you can see, since it is causing problems.
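For comparison, a minimal sketch of the standards-based alternative argued for here (hypothetical, not the project's current definition): base CHAR16 on char16_t and write u"" literals, so UTF-16 is guaranteed by the language itself instead of by the -fshort-wchar flag.

typedef char16_t CHAR16;  // hypothetical alternative definition

static_assert(sizeof(CHAR16) == 2, "char16_t is always a 16-bit code unit");

const CHAR16 *Title = u"Clover Кловер";  // UTF-16 on every conforming C++11 compiler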

13 hours ago, Slice said:

I don't remember details. It was a year ago.

 

I think that is a mistake, no?


2 hours ago, apianti said:

 

 

I think that is a mistake, no?

It was for testing drawSVGtext(TextBufferXY, 0, 0, 3, L"Clover Кловер", 1);

and I got the result.

First I compiled the project in VS, then opened the binary in a hex editor and searched for the string.


Parsing SVG, I have the following procedure:

  for (i = 0; i < len; i++) {
    CHAR16 letter = 0;
    s = GetUnicodeChar(s, &letter);   // decode one UTF-8 sequence into a UCS-2 character
    if (!letter) {
      break;                          // stop at the terminating NUL
    }
    x = addLetter(p, letter, x, y, scale, p->text->fontColor);
  }

and it works for Russian text in an SVG file.

Снимок экрана 2020-03-15 в 11.03.57.png


4 hours ago, apianti said:

 

22 hours ago, Jief_Machak said:

That's why I'm not using glibc printf...

 

I have never even come close to filling up the memory. Also, I almost never need to print on an Arduino unless I'm debugging, so there is no need to waste time redoing something that already exists and is probably better optimized.

You are always resistant to everything I'm doing, and what you propose instead doesn't work: glibc printf doesn't fit in an Arduino. PLUS: glibc printf can't print a FlashString; mine can, with a %F specifier I've created. I even modified GCC to give a proper warning for it. There is no need for a buffer that holds the whole result in advance; there is a callback mechanism to print to a serial port, for example. All that in less than 500 bytes of code. PLUS, you can deactivate features you don't need (float, long int, etc.).

Defo not the same as glibc...
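To illustrate the callback idea (a hypothetical minimal sketch, not the actual implementation; only %u, %s and %% are handled): the formatter never builds the whole result in a buffer, it pushes each character straight into a user-supplied sink such as a UART write routine.

#include <cstdarg>

typedef void (*PutcFn)(char c, void *ctx);  // the output sink, e.g. writes one byte to a UART

static void PrintUnsigned(PutcFn put, void *ctx, unsigned v) {
  char tmp[10];                              // enough digits for a 32-bit value
  int  n = 0;
  do { tmp[n++] = static_cast<char>('0' + v % 10); v /= 10; } while (v != 0);
  while (n > 0) put(tmp[--n], ctx);          // emit most-significant digit first
}

static void MiniPrintf(PutcFn put, void *ctx, const char *fmt, ...) {
  va_list ap;
  va_start(ap, fmt);
  for (; *fmt != '\0'; ++fmt) {
    if (*fmt != '%') { put(*fmt, ctx); continue; }
    ++fmt;
    if (*fmt == '\0') break;                 // lone '%' at the end of the format
    switch (*fmt) {
      case 'u': PrintUnsigned(put, ctx, va_arg(ap, unsigned)); break;
      case 's': for (const char *s = va_arg(ap, const char *); *s; ++s) put(*s, ctx); break;
      case '%': put('%', ctx); break;
      default:  put('?', ctx); break;        // specifier not supported in this sketch
    }
  }
  va_end(ap);
}

A call would look like MiniPrintf(UartPutc, nullptr, "x=%u %s\n", 42u, "ok"), where UartPutc is whatever routine pushes one byte out of the serial port; dropping unused specifiers is how such a formatter keeps its footprint small.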

 

It's interesting the way you like to contradict people. For example, you said that having a small printf is not worth the effort over glibc printf because YOU have plenty of memory in your Arduino, without thinking it's not the case for everyone.


 

22 hours ago, Slice said:

It was for testing drawSVGtext(TextBufferXY, 0, 0, 3, L"Clover Кловер", 1);

and I got the result.

First I compiled the project in VS, then opened the binary in a hex editor and searched for the string.

 

I was referring to the previous code you posted; I was wondering why you incremented the string pointer after you had already advanced to the next character, which may not even have been a single unit. You don't seem to be doing that here, unless you are saying you changed it. But the thing is that the string should already be in UTF-16, so it shouldn't need to be converted from UTF-8. The usage of L"" is not the same in C++ as in C; there are many things that are very different, and apparently what can't be grasped is that there were going to be many issues with trivial things by switching to C++...

 

20 hours ago, Jief_Machak said:

You are always resistant to everything I'm doing, and what you propose instead doesn't work: glibc printf doesn't fit in an Arduino. PLUS: glibc printf can't print a FlashString; mine can, with a %F specifier I've created. I even modified GCC to give a proper warning for it. There is no need for a buffer that holds the whole result in advance; there is a callback mechanism to print to a serial port, for example. All that in less than 500 bytes of code. PLUS, you can deactivate features you don't need (float, long int, etc.).

Defo not the same as glibc...

 

It's interesting the way you like to contradict people. For example, you said that having a small printf is not worth the effort over glibc printf because YOU have plenty of memory in your Arduino, without thinking it's not the case for everyone.

 

Actually, I said you weren't going to write a better, more optimized printf. How could you possibly be using everything in glibc? The Arduino IDE uses it...? How could it not work well? What Arduino do you have that doesn't have 16 KB of memory? Every Arduino has at least 32 KB of program memory, so..... huh? I am well aware of hardware-level access on the microcontroller; what do you think the implementation does? Something radically different than yours? Also, why could they not optimize the code in assembly if it would be better? You can spout crazy sh it all you want, but I am able to perform the scientific method and I know what cost-benefit analysis is. And I am reluctant to accept awful changes that accomplish nothing, so yeah, if that's what you're doing I'm going to oppose it. So far you have yet to convince me these changes were needed, or even helpful or useful.


2 minutes ago, apianti said:

scientific method

Ah, I was missing your "scientific method" arguments.

 

3 minutes ago, apianti said:

I'm going to oppose it

That's fine, we know that you are going to oppose everything.  :hysterical:

 

OK, so I'm waiting for the scientifically better Arduino project that uses glibc printf. Send me the sources and we'll compare. :thumbsup_anim:


14 minutes ago, Jief_Machak said:

Ah, I was missing your "scientific method" arguments.

 

That's fine, we know that you are going to oppose everything.  :hysterical:

 

OK, so I'm waiting for the scientifically better Arduino project that uses glibc printf. Send me the sources and we'll compare. :thumbsup_anim:

 

The Arduino IDE uses glibc; I just said that. WHAT ARE YOU TALKING ABOUT? I have literally given evidence against every single one of your claims, and you have yet to provide any evidence to the contrary, except your flawed string test. You think very highly of yourself; I do not really care about any of the things you think are making you look so great, because they are not. What would we even compare? You really think that you are going to best me at writing code or research? You're quite delusional if you think I am even remotely trying when I come here most of the time. This is my escape from CSEE research... lol. I'm definitely done with this anyway... I have an idea for something else I'm going to work on.


On 3/16/2020 at 8:41 AM, apianti said:

things you think are making you look so great

Interesting that in your mind, participating in an open-source project and looking good are connected!

It would not cross my mind, as nobody knows me...

On 3/16/2020 at 8:41 AM, apianti said:

best me

I'm not trying to best you. I'm just interested in what you can produce, if it's as great as you say. That's all. If it is, I would take it instead of what I did.


On 3/20/2020 at 11:57 AM, Jief_Machak said:

Interesting that in your mind, participating in an open-source project and looking good are connected!

It would not cross my mind, as nobody knows me...

 

I was referring to your idiotic statements about pretty much everything... Notice how I never said that? Because I say what the f u c k I actually mean unless saying the opposite will provoke a better outcome.

 

On 3/20/2020 at 11:57 AM, Jief_Machak said:

I'm not trying to best you. I'm just interested in what you can produce, if it's as great as you say. That's all. If it is, I would take it instead of what I did.

 

I don't care about that at all; I could write it in AVR assembler... You're a gnat to me, lol. And I probably wrote, or directly had a hand in developing, more than a third (at minimum) of this project. The only person who has done more is Slice. Not to mention the other things that have spawned from my conversations and ideas, or helping others with projects related to this. I told you I have another idea; I am currently extremely busy with work, classes, research, and teaching, so I am working on it slowly. It will be ready when it's ready. I have other more important stuff to worry about currently than your ego...


On 3/16/2020 at 8:41 AM, apianti said:

things you think are making you look so great

 

4 hours ago, apianti said:

Notice how I never said that

Sorry, you're right. You didn't talk about "looking good", you talked about "looking great". But you brought it up, not me. I am not the one who thinks that participating anonymously in an open-source project can make you look "great". If I were a shrink, I would find that very, very interesting!

 

4 hours ago, apianti said:

more than a third (at minimum)

 

I have other more important stuff to worry about currently than your ego...

Oh, that's interesting. After "looking great", we are now at the "amount of lines of code" competition. Even more interesting knowing that your commits total 400 lines of code on Clover since 2012-09-09.

Yeah, you sure do have more important stuff to worry about than my ego: yours. It must take all of your time :hysterical:.

 

It was nice playing with you, but every good thing comes to an end. I'll leave you the final word of this interesting conversation.

It was interesting meeting you.
Goodbye.

 


@Jief_Machak Please don't forget his great ideas that helped so many projects - "do it like Apple" really was a huge technical leap for e.g. prelinking kext injection. I don't know where we would be today without this kind of in-depth analysis.

400 LoC just means you haven't donated enough yet, but not to worry, the button is still there and Clover 3.0 might just get more than some incomplete libraries some day.


@Download-Fritz I wouldn't judge him. I never said he never had any ideas or did nothing. I hope people who have such a big ego can base it on at least some skill. At first, it was "all your ideas are {censored}", "I have scientifically proved...", etc. So it was 100% non-constructive and funny. But then he escalated to "you want to look great" and something about ego that I don't remember. A well-known rhetorical technique when you no longer have arguments: attack the person instead of what he says. It's also well known that people tend to accuse others of what they already do themselves, because that's the only thing they can imagine. He's probably too young to have realised that everything you say about others tells more about you than about them. He also tried "I did a lot for the project", which is another well-known rhetorical technique: I did a lot, so I'm right in the current conversation. So he has his own management problems, but that doesn't mean he is stupid or anything like that. Definitely not. He just doesn't know how to balance pros and cons, and he doesn't know how to say things nicely. A lot of intelligent people are not nice. That just happens.

 

So I will ignore any further non-constructive arguments from him, but I'll still listen to all the rest. For example: although he can't help himself saying "you know nothing about C++" (100% non-constructive), when he gave the idea of literal operators (not having thought about literal operators = knowing NOTHING??? You see the basic logical error there, I'm sure), I still kept the idea (constructive) and will implement it soon. It's just a small example, but if there are bigger ones, I'll take them too.
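For what it's worth, a minimal sketch of that literal-operator idea (hypothetical; WString here is only a stand-in for whatever string class the project ends up using): a user-defined literal wraps a u"" string so call sites stay short.

#include <string>
#include <cstddef>

using WString = std::u16string;  // stand-in for the project's wide string class

inline WString operator""_WS(const char16_t *s, std::size_t len) {
  return WString(s, len);        // one construction, length already known at the call site
}

// usage: auto title = u"Clover Кловер"_WS;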

 

Long story short: I will never say that all his ideas are {censored}.


10 minutes ago, Download-Fritz said:

@Jief_Machak I think you missed the sarcasm, "do it like Apple" is not quite an in-depth analysis. :)

yes I did miss it :hysterical::thumbsup_anim:

4 hours ago, Download-Fritz said:

400 LoC just means you haven't donated enough yet, but not to worry, the button is still there and Clover 3.0 might just get more than some incomplete libraries some day.

This sentence takes on a new meaning... :hammer:


 

22 hours ago, Jief_Machak said:

 

Sorry, you're right. You didn't talk about "looking good", you talked about "looking great". But you brought it up, not me. I am not the one who thinks that participating anonymously in an open-source project can make you look "great". If I were a shrink, I would find that very, very interesting!

 

Oh, that's interesting. After "looking great", we are now at the "amount of lines of code" competition. Even more interesting knowing that your commits total 400 lines of code on Clover since 2012-09-09.

Yeah, you sure do have more important stuff to worry about than my ego: yours. It must take all of your time :hysterical:.

 

It was nice playing with you, but every good thing comes to an end. I'll leave you the final word of this interesting conversation.

It was interesting meeting you.
Goodbye.

 

 

Yeah, I wasn't saying that to look like anything, it is a fact. You said "just interested in what you can produce," so I told you, moron.

 

EDIT: Not sure where you came up with that number, since here is one random commit that has more than that and I didn't even look hard; I know there are bigger ones before this: https://sourceforge.net/p/cloverefiboot/code/2283/

 

15 hours ago, Download-Fritz said:

@Jief_Machak Please don't forget his great ideas that helped so many projects - "do it like Apple" really was a huge technical leap for e.g. prelinking kext injection. I don't know where we would be today without this kind of in-depth analysis.

400 LoC just means you haven't donated enough yet, but not to worry, the button is still there and Clover 3.0 might just get more than some incomplete libraries some day.

 

DF, shut the f u c k up, you bigoted POS. I have no reason to write Clover 3.0 anymore because many of the things I said went into OpenCore. I also can't stand any of you a s s h o l e s because none of you believe in science at all. Just like when I said that the shutdown panic on 300 series chipsets was caused by a missing memory region. I was totally wrong about that, RIGHT? But it was of course the SMI, except the fix was adding a missing memory region, wasn't it????

 

EDIT: Oh, I forgot, 400 LoC... or 65K, and I removed a majority of the code before putting it up because I wasn't sure of the legality, lol: https://github.com/apianti/Clover/graphs/contributors

 

10 hours ago, Jief_Machak said:

@Download-Fritz I wouldn't judge him. I never said he never had any ideas or did nothing. I hope people who have such a big ego can base it on at least some skill. At first, it was "all your ideas are {censored}", "I have scientifically proved...", etc. So it was 100% non-constructive and funny. But then he escalated to "you want to look great" and something about ego that I don't remember. A well-known rhetorical technique when you no longer have arguments: attack the person instead of what he says. It's also well known that people tend to accuse others of what they already do themselves, because that's the only thing they can imagine. He's probably too young to have realised that everything you say about others tells more about you than about them. He also tried "I did a lot for the project", which is another well-known rhetorical technique: I did a lot, so I'm right in the current conversation. So he has his own management problems, but that doesn't mean he is stupid or anything like that. Definitely not. He just doesn't know how to balance pros and cons, and he doesn't know how to say things nicely. A lot of intelligent people are not nice. That just happens.

 

So I will ignore any further non-constructive arguments from him, but I'll still listen to all the rest. For example: although he can't help himself saying "you know nothing about C++" (100% non-constructive), when he gave the idea of literal operators (not having thought about literal operators = knowing NOTHING??? You see the basic logical error there, I'm sure), I still kept the idea (constructive) and will implement it soon. It's just a small example, but if there are bigger ones, I'll take them too.

 

Long story short: I will never say that all his ideas are {censored}.

 

So basically, the fact that I had to correct almost everything you said about C++ means you do know it? Maybe you should go back through the thread, because there are many more examples. Judge me all you f u ck ing want, I don't give a s h i t.

 

10 hours ago, Download-Fritz said:

@Jief_Machak I think you missed the sarcasm, "do it like Apple" is not quite an in-depth analysis. :)

 

Once again, f u c k off...

 

10 hours ago, Jief_Machak said:

yes I did miss it :hysterical::thumbsup_anim:

This sentence takes on a new meaning... :hammer:

 

That you are oblivious? Also, I have signatures turned off and haven't really been here so I didn't realize it still said that. I have removed it because I don't care about any of this.

 

Gotta really say that you are the most stable geniuses!

 

Edited by apianti

7 hours ago, apianti said:

I also can't stand any of you a s s h o l e s because none of you believe in science at all. Just like when I said that the shutdown panic on 300 series chipsets was caused by a missing memory region. I was totally wrong about that, RIGHT? But it was of course the SMI, except the fix was adding a missing memory region, wasn't it????

Oh damn sorry, I stupidly thought your science said the issue was that our shims were being unmapped... but yeah, if we get more vague it works, "something about memory". :)


Convenient that you left out the following posts, where I backtraced and proved that it was indeed a missing memory region. So I was wrong about which one? Lol, you are literally the dumbest person I've ever known. You care so much about looking like you're right and better than me that you constantly move the goalposts and whatabout everything. Everyone in that entire thread was telling me I was wrong and that it was SMI; it literally had nothing to do with SMI. It had to do with a missing memory region; it doesn't really matter which one. The SMI calls weren't even on the call stack, lol. And yes, you obviously can only stupidly think....


@apianti Look, I'm sorry that you are obviously very hurt over that, but your version of reality is unfortunately one that nobody manages to share. You provided a specific theory, you were told why that specific theory was garbage, and your specific theory was garbage indeed. "Look, it was something in a completely different department, but similar!" Uh, yes, my congratulations. vit's blind guess was a bit worse than your blind guess (cause, unfortunately, weighs significantly more than symptom); I hope that gives you the satisfaction you've been seeking the past few days with your heroic self-illustrations. Maybe someone will see your true genius when you actually manage to solve a problem or write code instead of just being a keyboard warrior. But don't waste your time here, CSEE is waiting. :)

 

 


