The software standards for small-UAV certification are not defined yet. It is not just the FAA; ICAO, JAA, EASA, the CAA and other authorities each cover a different set of countries.
This is true; however, considerable work is already being done on this topic by various aviation authorities. See this link for the technical opinion.
Coverity/MISRA-C/DO-178 compliance does not seem to be the approach followed for small UAVs. It is too binding and/or requires other metrics that are not yet defined. Paparazzi scores slightly better on certain metrics, although it is debatable whether the different projects used similar settings for static analysis, and those results may give a different impression than is actually the case.
There are some misconceptions here. DO-178 is a systems/software engineering process for safety-critical airborne software, whereas MISRA-C is a set of guidelines used to improve software safety and reliability. DO-178, as previously stated, dictates (based on the development assurance level) what your software development life-cycle should look like, and is only really effective if you have very large teams working on (very) complex projects. These days a more effective approach seems to be a combination of Agile development techniques and DO-178 (traditional waterfall development cycles are not really that effective, in my opinion). DO-178 basically stipulates the WHAT, not the HOW: it says you should have a coding standard, but does not dictate which one (you can choose whatever you prefer). It also says you should have a configuration management plan/system, but HOW you do it is up to you. I have never seen two companies follow exactly the same process when doing DO-178 either; everyone uses some sort of custom variation on the theme.
MISRA-C, on the other hand, is simply a set of rules you can use with a static analysis tool. These rules can easily produce false positives, so getting the rule selection right is very important. Blindly running MISRA on code never makes sense; you need to investigate every warning/error triggered to determine whether you are dealing with a false positive or not.
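To illustrate the false-positive point, here is a minimal sketch (register address and names invented, not from either code base) of the kind of construct a MISRA rule set routinely flags even though it is unavoidable in memory-mapped embedded code; the right outcome is a reviewed, documented deviation rather than a code change:

    #include <stdint.h>

    /* Hypothetical memory-mapped status register. Converting an integer */
    /* literal to a pointer is typically flagged by MISRA's              */
    /* pointer/integer conversion rules, yet it is the normal way to     */
    /* reach hardware registers - a classic "investigate, then deviate   */
    /* with justification" finding rather than a real defect.            */
    #define GPIO_STATUS_REG  (*(volatile uint32_t *)0x40020010u)

    uint32_t read_gpio_status(void)
    {
        return GPIO_STATUS_REG;  /* deliberate, documented deviation */
    }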
For our (small UAV) industry, it is a good indicator for keeping a project on track, but it does not in any way predict a future failure in the system, nor is it of great significance to either PX4 or Paparazzi. Both projects have delivered excellent long-term, error-free MAV missions and sound architectural design. Given the effort already invested, I believe both projects have a good chance of achieving software compliance in the future.
This statement is not entirely true. Let's say you have a path through the code that only executes when a certain sequence of events occurs; most failure-handling modes are examples of this. Test flying will not necessarily expose such a fault, but static analysis might. HITL or SITL testing can also reveal hidden bugs, but only if your test cases are very extensive and cover all possible scenarios, which is extremely difficult (and sometimes impossible, depending on the complexity of the code base).
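As a contrived sketch of such a latent path (all names invented, not taken from PX4 or Paparazzi): the risky division below only runs after a GPS loss followed by a battery failsafe, a sequence a test flight may never produce, yet many static analyzers will warn about the possibly-zero divisor regardless of whether that path was ever exercised:

    /* Hypothetical failsafe handler. The faulty branch only runs when a */
    /* GPS loss is followed by a low-battery event.                      */
    static float last_good_groundspeed_mps;  /* stays 0.0f until the first fix */

    float estimate_time_to_home_s(float distance_m, int gps_lost, int batt_low)
    {
        if (gps_lost && batt_low) {
            /* Dead-reckoning fallback: if the failsafe triggers before a */
            /* fix was ever received, this divides by zero and the        */
            /* resulting inf/NaN poisons whatever consumes the estimate.  */
            return distance_m / last_good_groundspeed_mps;
        }
        return 0.0f;
    }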
Again, most MISRA rules are in my opinion merely "good programming practice" and do not really have a bearing on safety, but they can catch things like floating-point comparisons and unreachable code, as well as dynamic memory allocation during runtime, which are all very good things to investigate if you want your real-time code to run deterministically.
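For concreteness, here are those three patterns collected in one artificial fragment (nothing here is taken from an actual autopilot); each looks harmless but either hides dead logic or undermines deterministic real-time behaviour:

    #include <stdlib.h>
    #include <math.h>

    #define ALT_EPS 0.01f

    void update_altitude(float alt, float target)
    {
        /* 1. Floating-point equality: almost never true in practice.       */
        if (alt == target) {            /* flagged: use a tolerance instead, */
            return;                     /* e.g. fabsf(alt - target) < ALT_EPS */
        }

        /* 2. Dynamic allocation at runtime: non-deterministic timing and    */
        /*    a fragmentation/failure risk on a small embedded target.       */
        float *scratch = malloc(64u * sizeof *scratch);   /* flagged */
        if (scratch != NULL) {
            free(scratch);
        }
        return;

        /* 3. Unreachable code after the unconditional return above.         */
        alt = target;                   /* flagged: can never execute */
    }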
When it comes to Paparazzi vs. PX4/APM and any future software compliance regulations we may have to follow, I do agree that neither stack has a significant disadvantage, because it is really not too much effort to fix minor problems here and there (like float compares, implicit casts, etc.) if required.
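As an example of how mechanical those fixes usually are, here is a typical implicit-conversion finding and the one-line change that clears it (the function and values are invented purely for illustration):

    #include <stdint.h>

    /* Before: implicit narrowing from int32_t to int16_t in the return, */
    /* which MISRA-style checkers usually flag.                          */
    int16_t pwm_offset_before(int32_t throttle_us)
    {
        return throttle_us - 1500;               /* implicit conversion  */
    }

    /* After: same logic, with the conversion made explicit so the       */
    /* intent is documented and the warning goes away.                   */
    int16_t pwm_offset_after(int32_t throttle_us)
    {
        return (int16_t)(throttle_us - 1500);    /* explicit, intentional */
    }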