
C++ proposition


Jief_Machak

29 minutes ago, Jief_Machak said:

Could you have a look at my last commit (one line)? There was a test that was always true. I don't know if it should be replaced.

Yes, you are right, it was a bug.

 

The next bug with VS I want to catch is that Unicode text is assumed by VS to be UTF-8 instead of UTF-16.

So I did this:

VOID REFIT_MENU_SCREEN::AddMenuInfo(CONST CHAR16 *Line)
{
  REFIT_INFO_DIALOG *InputBootArgs;

//  InputBootArgs = (__typeof__(InputBootArgs))AllocateZeroPool(sizeof(REFIT_INPUT_DIALOG));
  InputBootArgs = new REFIT_INFO_DIALOG;
#ifdef _MSC_VER
  // VS workaround: the L"" literal is not encoded as expected UTF-16, so hand the raw bytes to takeValueFrom()
  const CHAR8 *Str8 = (const CHAR8*)Line;
  InputBootArgs->Title.takeValueFrom(Str8);
#else
  InputBootArgs->Title.takeValueFrom(Line);
#endif
//  InputBootArgs->Tag = TAG_INFO;
//  InputBootArgs->Item = NULL;
  InputBootArgs->AtClick = ActionLight;
  AddMenuEntry(InputBootArgs, true);
}

I did the same in VectorGraphics and it works. But AboutMenu and HelpMenu do not show Unicode.


Wouldn't it be easier to configure the project to use UTF-16 instead of UTF-8?

 

The fact that you can do:

const CHAR8 *Str8 = (const CHAR8*)Line;

and have it still work is wrong. Having UTF-8 in a CHAR16* is wrong.

I'm pretty sure there is an option somewhere to tell VS that an L"literal" is UTF-16, not UTF-8.
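For what it's worth, if the source files are saved as UTF-8 (an assumption; I haven't checked how the Clover sources are encoded), MSVC does have switches to declare the source charset, which should make L"" literals come out as real UTF-16:

/source-charset:utf-8
/utf-8                  (shorthand that also sets /execution-charset:utf-8)

Whether that is enough to fix this particular VS build is untested.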


Has anyone ever dreamt of compiling Clover from scratch in less than 10 seconds? It's also almost instant if you only modify a few files.

That is now possible with the Xcode project located in Xcode/CloverX64. NOTE: you need to have compiled with the usual system (ebuild.sh) at least once before using this new Xcode project.

Open Xcode/CloverX64/CloverX64.xcodeproj, click build, and have a look in Xcode/CloverX64/DerivedData/clang-slink; the .efi is there.
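For those who prefer the command line, something like this should work too (only the project path comes from above; the rest of the invocation is a guess):

xcodebuild -project Xcode/CloverX64/CloverX64.xcodeproj build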


2 hours ago, Slice said:

I did the same in VectorGraphics and it works. But AboutMenu and HelpMenu do not show Unicode.

I don't see the problem. I've commented out the #ifdef _MSC_VER in VectorGraphics and I got the same result. What's the problem? Can you send me a screenshot of what's wrong?

 

I think the strings are properly encoded as UTF-16 with VS2017 because if they weren't, it would NOT work with a non-SVG theme.

Edited by Jief_Machak

3 minutes ago, Slice said:

The difference is between Clang and VS compilation, regardless of Clover revision, even 4000.

Not sure what you mean...

 

By the way, I don't get any log when I use a VS2017-compiled version of CloverX64.efi. Any idea?


I committed a draft template to make Theme a new class.

For example, instead of

if (GlobalConfig.BootCampStyle) {

we will write

if (Theme.BootCampStyle) {

etc.
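Something like this, as a very rough sketch (only BootCampStyle comes from the example above; the class name and the other members are placeholders):

class ThemeSettings {
public:
  BOOLEAN BootCampStyle = FALSE;   // moved here from GlobalConfig
  // ... the other theme-related fields from GlobalConfig would follow
};

extern ThemeSettings Theme;        // one global instance, so call sites read Theme.BootCampStyle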


The fact that you need to skip an extra char after every char proves that it's not UTF-8.

    Str8 = GetUnicodeChar(Str8, &letter);
    Str8++;

I'm trying to reproduce. It works in English (ASCII chars), so I have to switch language. It's Russian, right?


3 minutes ago, Jief_Machak said:

The fact that you need to skip an extra char after every char proves that it's not UTF-8.

    Str8 = GetUnicodeChar(Str8, &letter);
    Str8++;

I'm trying to reproduce. It works in English (ASCII chars), so I have to switch language. It's Russian, right?

Yes, Russian. You may test in French, which also has some non-ASCII letters, if you write them into the help menu. I am not sure whether font.png contains such characters.

For example, the font folder contains device.png, which contains extended Latin.


Looking at the display bug, I realise that DBG("%s\n", string) with string being a CHAR16* doesn't work. I traced into the EDK sources and it simply can't work: AsciiVSPrint outputs UTF-8 and there is no conversion.

To work around that, I've committed an XString (no W). To convert an L-string to UTF-8, just do XString(Lstring).c (.c gives a pointer to pass to DBG or others).
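In other words, a call site looks roughly like this (the literal is just an example; the XString constructor and .c member are as described above):

  DBG("%s\n", XString(L"Some UTF-16 text").c);   // XString converts the L"" string to UTF-8, .c exposes the char*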

 


I should have known straight away. A few days ago, when I wrote XStringW_test, I noticed that VS doesn't properly encode UTF-16 literals. Thanks M$.

This is the same problem. And when I say not properly, it's really NOT proper. It's not UTF-8 (although it will be compatible as long as you only have chars < 255).

If it's a compiler switch or environment setup, Microsoft experts, please help us.

 

Long story short: there is currently NO WAY to make UTF-16 literals work if they contain chars > 255. It looks like this is only the case for the hard-coded help menu.

Solution: use XStringW!!! (I knew that would be useful; thanks @apianti for the advice of using C++.) I switched the AddMenuInfo parameter from CHAR16* to char* and removed the L in front of the literal. In AddMenuInfo (menu.cpp:1342), I just have to do "InputBootArgs->Title.takeValueFrom(Line);" <- Line now being a UTF-8 string.
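Pieced together against the earlier snippet, the change looks roughly like this (a sketch only, not the exact committed code):

VOID REFIT_MENU_SCREEN::AddMenuInfo(const char *Line)   // was CONST CHAR16 *Line
{
  REFIT_INFO_DIALOG *InputBootArgs;

  InputBootArgs = new REFIT_INFO_DIALOG;
  InputBootArgs->Title.takeValueFrom(Line);   // Line is now a UTF-8 string; no #ifdef needed
  InputBootArgs->AtClick = ActionLight;
  AddMenuEntry(InputBootArgs, true);
}

Callers then pass plain UTF-8 literals, without the L prefix.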

 

I think we should convert all the literals in Clover to UTF-8 anyway.

 


Yeah, crazy... I mean, I wonder why it even worked before? Probably because you can specify the meaning of wchar_t in C, since it is not an actual part of the language but part of the standard library. The C++ standard says wchar_t is a builtin type and specifies that it is locale dependent. There is no way to change this because that is the behavior that is specified.

Most likely wchar_t is using the current MS code page for your region and char is using ASCII or the Latin-1 code page. They could be the same size; the specified difference is that char can use multiple units to encode a character, while wchar_t uses one unit per character. I'm guessing that on a POSIX-compliant system these will probably be UTF-8 and UCS-2 by default. My Windows 10 is Latin-1 (1252) and UCS-2 for en-US; Ubuntu appears to be UTF-8 and UCS-2 (could be UTF-16, I didn't check the surrogates). You can make a UTF-16 literal by using the u"" prefix instead of the L"" prefix in C++.

Why do you need a class? Your solution always seems to be to add another class, which shifts more and more issues from static to dynamic overhead. Why are you even converting things to not use UTF-16 anymore?

 

EDIT: Actually, it only says that wide means being able to represent every distinct code for the largest character set of the locale, and that narrow must be one-byte units.

 

EDIT2: I actually just got a new computer and didn't have any other languages installed. I installed some other language packs that I use, and now wchar_t is UTF-32...?
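For reference, here is what the different literal prefixes give you in standard C++ (nothing Clover-specific, just an illustration; \u00e9 is 'é'):

const char     *u8str  = u8"h\u00e9llo";  // u8"" : always UTF-8 (char units; char8_t in C++20)
const char16_t *u16str = u"h\u00e9llo";   // u""  : always UTF-16 code units (what UEFI's CHAR16 expects)
const wchar_t  *wstr   = L"h\u00e9llo";   // L""  : wchar_t, whose size and encoding are implementation-defined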

Edited by apianti

On 3/11/2020 at 6:01 PM, apianti said:

 

Damn, you really do know nothing about C++, huh? The literal operator (https://en.cppreference.com/w/cpp/language/user_literal) is used to convert anything in quotes to another type.

Thanks for this lesson. I'm really not sure if we can use it, but I see an interesting possibility:

100_MHz == 100000_kHz

yes?


50 minutes ago, Slice said:

EDK2 libraries deal with CHAR16* and we will use them. For example, file names.

 

UEFI specifies usage of UCS-2 (UTF-16 without surrogates), which is why it uses CHAR16 *. He has set CHAR16 equal to wchar_t for C++, to match L"". But having changed to C++, you should now use u"" to create UTF-16 strings, because that is what is guaranteed to use 16-bit encoding units. wchar_t can be changed in C to something specific because it is defined in the standard library; it cannot in C++ because it is specified as part of the language. So the L"" strings in C could be changed to a 16-bit encoding; they cannot be in C++.

 

13 minutes ago, Slice said:

Thanks for this lesson. I'm really not sure if we can use it, but I see an interesting possibility:

100_MHz == 100000_kHz

yes?

 

Yeah, absolutely you can. I think it would be something like this:

constexpr size_t
operator "" _kHz (
  unsigned long long kHz  // integer literal operators must take unsigned long long
) {
  return kHz * 1000;      // kHz -> Hz
}

constexpr size_t
operator "" _MHz (
  unsigned long long MHz
) {
  return MHz * 1000000;   // MHz -> Hz
}
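Since the operators are constexpr, the relation from the post above can even be checked at compile time (just an illustration):

static_assert(100_MHz == 100000_kHz, "both are 100,000,000 Hz");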

 

