SDFileSystem and PokittoDisk not working 100% with latest library

Too bad read-write support might be too much of a challenge for you :wink:

Heh. It’s just how FemtoIDE releases work: crudely add a feature in one release, refine it in the next. That way others don’t have to wait too long for functionality they need, and can find problems and make suggestions before too much effort has been put into it.


I wonder :thinking: why it was working before? (old EmBitz build).


It’s possible that with a bunch of changes to the library the memory requirements have changed. If Xploritto was already pushing the limits of RAM usage then that’s definitely a possibility.

The answer that first comes to mind is ‘copy elision’.

Depending on how open is being used the compiler might be able to elide the copy.
This is the typical behaviour when a temporary object is passed as an argument to a function.

For example, if you were to do open("/some/file.txt"), the exact process would be:

  • Pass the const char * "/some/file.txt" to the std::string constructor string(const char *), creating a temporary std::string
  • Copy the temporary string to open's path parameter
  • Destroy the temporary string

In which case the compiler can figure out that the last two steps are unnecessary and elide the copy by directly initialising the path parameter with the (unnamed) const char * value "/some/file.txt".
(From C++11 onwards this will potentially result in a move rather than a copy, but the compiler may be able to treat this as an opportunity to do copy elision rather than an outright move.)

However, if the std::string had already been constructed and assigned to a variable (i.e. it’s an lvalue rather than a temporary/rvalue) then the copy cannot be elided, because a variable is not a temporary - the construction has already occurred in a separate statement.
Consequently the string must be copied and the heap must suffer the penalty of the subsequent allocation.


// Copy is elided:
open("/some/file.txt");

// Copy cannot be avoided:
std::string object { "/some/file.txt" };
open(object);

Ultimately the best solution is probably to have a const char * overload and a const std::string & overload which just forwards string.c_str() to it, if you can get away with it.
That way open("/some/file.txt"); doesn’t even create a std::string, it just passes a pointer.
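A minimal sketch of that overload pair, assuming a hypothetical File class (the real SDFileSystem API may well differ):

```cpp
#include <string>

// Hypothetical File class illustrating the overload pair described above.
// The class name and the stub logic are assumptions for demonstration only.
class File
{
public:
    // const char * overload: no std::string is ever constructed.
    static bool open(const char * path)
    {
        // A real implementation would talk to the SD card here;
        // this stub just rejects null pointers.
        return (path != nullptr);
    }

    // const std::string & overload: forwards the underlying buffer,
    // avoiding any copy of the string's contents.
    static bool open(const std::string & path)
    {
        return open(path.c_str());
    }
};
```

With this pair, File::open("/some/file.txt") binds directly to the const char * overload, so no std::string is constructed at all.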

(There’s an even better way than that which involves also passing the length of the string, but I don’t want to drag out the explanations and examples. If you’re interested in that better solution, let me know.)

It doesn’t really need mentioning, but I’ll mention it anyway…

In static bool open(const std::string path); the const is actually meaningless because the std::string is passed by value so the const doesn’t affect the semantics in any way.
(The same is true for volatile, and for return types when returning by value.)

With static bool open(const std::string & path); it is the std::string referred to by the reference that is const, not the reference itself, which is why const makes sense in that case.
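The difference can be demonstrated with a static_assert. (The function names here are made up for illustration; they aren’t part of the library.)

```cpp
#include <string>
#include <type_traits>

// Top-level const on a by-value parameter is stripped from the function's
// type, so these two lines declare the *same* function, not an overload pair.
bool openByValue(std::string path);
bool openByValue(const std::string path); // legal redeclaration of the above

bool openByValue(std::string path)
{
    return !path.empty();
}

// With a reference parameter the const applies to the referred-to
// std::string, so there it genuinely affects the signature and semantics.
bool openByRef(const std::string & path)
{
    return !path.empty();
}

// The type of openByValue is plain bool(std::string) - the const is gone:
static_assert(
    std::is_same<decltype(openByValue), bool(std::string)>::value,
    "top-level const does not appear in the function type");
```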

1 Like

@Pharap Thank you for the explanation.
My C++ knowledge has improved a lot since I joined this wonderful community.


The problem was not running out of RAM; std::string should destroy itself when going out of scope.
The problem was that I forgot the return true or false in that function (broken stack).

Not returning at the end of a function doesn’t break the stack. Return values are passed in registers.

That should be a hard error - i.e. the compiler refuses to compile the code.
Have you got your compiler set to use -fpermissive or something?
Or maybe -std=gnu++11 instead of -std=c++11?

I can’t find a better explanation of why it crashed.



I’ve made this mistake many times and GCC only gives a warning stating “no return statement in function returning non-void”. -fpermissive hides the warning altogether, and -std=gnu++11 vs -std=c++11 doesn’t have any impact.

You can, however, make it an error by using -Werror=return-type. This is similar to the -Werror flag that treats all warnings as errors, but limits it to specific warning types.
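As an illustration (the function below is made up, not from the library), compiling the buggy shape of code with that flag turns the silent mistake into a build failure:

```cpp
// Compile with, e.g.:  g++ -std=c++11 -Werror=return-type example.cpp
// GCC then promotes its missing-return warning for non-void functions
// into a hard error instead of letting the code build.

// The buggy version would look like this, falling off the end when
// path is null, which is undefined behaviour:
//
//   bool open(const char * path)
//   {
//       if (path != nullptr)
//           return true;
//   }

// Corrected version: every control path returns a value.
bool open(const char * path)
{
    if (path != nullptr)
        return true;
    return false;
}
```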


GCC did not give me a warning, that’s why I forgot about it.

I was surprised by this, so I had to check what the standard has to say.

Apparently I’ve been spoilt by the sensibleness of Visual Studio’s compiler,
because it’s actually ‘undefined behaviour’, which I would assume means ‘no diagnostic required’.

That seems like a really daft oversight.
Probably too late to get that changed for C++20, but I’m very tempted to propose a change for C++23.

This really ought to be the default behaviour.
I cannot think of a single logical reason for it not having been the default behaviour in the first place.
(“The standard doesn’t require it” is an excuse, not a reason.)

Agreed. Though I myself typically compile with -Wall and then right before a release build add -Werror and clean up any and all warnings. Even the simple comparison-between-signed-and-unsigned-integers warning when I know the values will never actually exceed the limits of a signed integer. To me it’s more a matter of knowing that there’s a reason why these warnings are there, and it’s better to just get them fixed before doing any final testing for a release build. Call it OCD but I just don’t like seeing all those warnings pop up left and right, and hiding them with -fpermissive is such a “sweep it under the rug” routine that it can lead to some serious headaches later on.


I completely agree.
It’s not just OCD, it’s what everyone ought to be doing.
Near enough all warnings are avoidable with well-written code.

I aim to be even stricter than that by cleaning up warnings the moment I’m aware of them.
I think no warnings should be the natural state of things.

Besides which, if you let them build up then getting rid of them always seems like a chore.
Dealing with one or two warnings every few builds always feels much more manageable.

If only someone had told the Arduino people that when they first enabled -fpermissive for the Arduino IDE.

I expect the Arduino people thought “beginners hate errors and warnings, so let’s use -fpermissive to bury them all” without stopping to contemplate the implications of that - horrible runtime bugs that only experts have a hope of unpicking.

The worst bit about -fpermissive is that it isn’t even properly documented.
The only explanation of what it does is essentially a handwave that amounts to “it makes some warnings and errors go away”, without specifying which warnings and errors.

Personally I think it should at least be banned on embedded systems, where debugging is not exactly easy. It was already a bit of a challenge to find all the little bugs with this issue, and that was without the dreaded -fpermissive.

Realistically all “undefined” behavior should be an error and any loosely defined compiler flags should simply be removed. The “beginners” that these are being hidden from are expected to eventually be intermediate, or at least knowledgeable, programmers and hiding their mistakes from them is never a good way to learn.

But sadly the Arduino community isn’t always as friendly to beginners like this community is.


:scream: is this true? ‘-fpermissive’ is on by default? As if I needed another reason to diss Arduino…

1 Like

Sadly yes.

I’ve been ranting about it in forums for the last 4-5 years.
(More so on the Arduboy forum where it’s an active problem,
but I mention it here from time to time.)

I think Arduino’s goal was more or less “dumb down C++ so hardware people can use it to prototype things without having to actually learn the language”, so they probably aren’t expecting people to improve.

It’s a false economy though, eventually anyone who wants to make any decently sized project will have to learn the language rules or suffer the consequences (i.e. bugs, error messages and the inability to express one’s intentions).

Not only is it true but the scariest part to this real life horror story is…

You really don't want to know
I'm begging you please turn back now before it's too late

You can’t turn it off in any simple way. There’s no settings and no documentation to edit the compile flags. Unless you want to spend money on something like AVR studio it’s also not that simple to build for arduino without it.


There are a number of options to get around it,
ranging from using makefiles to editing the relevant platform.txt,
but the problem with all of them is that if you use one of the alternatives you then have to expect other people to be able to achieve the same environment if they want to compile the code.

That’s awkward enough with a well-experienced audience,
but if you’re making games for a programmable games console, where some people find compiling with the IDE hard enough simply because they aren’t used to doing that sort of thing, then you can pretty much write off all the alternatives.

Granted people who aren’t used to compiling probably aren’t going to want to be editing your project’s code and will probably benefit more from a precompiled executable, but there are intermediate level programmers who might be alright with the IDE but struggle with setting up a more demanding environment.