Dataverse Solution Checker doesn't like PCFs

Riccardo Gregori - Nov 1 - Dev Community

For quite a while I've noticed that for any Solution containing PCF components (it doesn't matter whether they were created with --framework react or --framework none), the Solution Checker embedded in Dataverse has started returning 🔴 Critical errors.

Initially I thought the root cause was in my own code, or in one of the npm packages I'd added to extend the capabilities of my PCFs/web resources, but... after digging for a while without success, I tried something different.

🧪 Basic test: a plain, uncustomized, PCF control

I started from the basics, thinking: "let's see how it behaves with a plain PCF control, without any custom code in it".
Pretty easy to do:

pac pcf init --namespace My --name Label1 --template field --run-npm-install

The first thing to notice here is the number of warnings it gives (maybe it's time for MS to update the template 😉):

[Image: warnings on pcf init]

Let's ignore the warnings and go ahead with the following (Label1 is the unique name of a holding solution I previously created in my environment):

pac pcf push --solution-unique-name Label1

During import, the following warning message appears:

[Image: warning during import of Label1]

Let's go to the actual environment and run the Solution Checker.

Note: there seems to be a bug in the PAC CLI here. Even though I specified --solution-unique-name, PAC created a temporary solution. I don't know why. I had to add the control to my holding solution manually... no worries, let's go ahead.

Running the Solution Checker on the solution that contains only my PCF control, this is the result:

[Image: 1 critical error]

This is the critical error returned:

Usage of the JavaScript eval function should be limited where possible. The eval function can be a dangerous function as it allows strings to be executed as scripts within the context of the caller. This can be exploited to run malicious code. Eval is also usually slower than other options due to the lack of optimizations of the script text passed to eval. If this error is reported for a Power Apps component framework code component created using CLI tooling, package your control with 'msbuild /p:configuration=Release' or 'npm run build -- --buildMode production' to produce a release build that does not include 'eval' usage.

😕🤬 Why does pac pcf push do that by default, if it's so easy to solve!?

Simple... because that advice is partially wrong.

First of all, because pac pcf push does the build step for you, and there's no (documented) way to change its behavior.

Second, because npm run build -- --buildMode production simply generates a bunch of files in the /out/controls subfolder of your PCF project, which cannot be imported on their own.

To make it work you need to drop pac pcf push and manually create a solution project to be built and deployed to your environment, as described in this article (side note: I used dotnet build instead of MSBuild, because the latter didn't seem to work... but maybe it's just an issue on my machine).
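For reference, the manual workflow looks roughly like this (a sketch, not a drop-in script: folder names, publisher name, and prefix are placeholders I made up, and the generated zip name depends on your solution project's name):

```shell
# Create a separate solution project next to the PCF project
mkdir LabelSolution && cd LabelSolution

# Initialize the solution project (publisher name/prefix are placeholders)
pac solution init --publisher-name MyPublisher --publisher-prefix my

# Reference the folder containing the PCF's .pcfproj
pac solution add-reference --path ../Label1

# Build in Release mode, so the bundle does not include 'eval'
dotnet build -c Release

# Import the generated zip (path/name vary with your solution project)
pac solution import --path bin/Release/LabelSolution.zip
```

The key point is the Release configuration: it is the Debug build that ships the eval-laden bundle the Solution Checker complains about.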

IMHO this approach has several disadvantages.

  1. It's painfully slow compared to pac pcf push.
  2. It complicates the ALM pipeline.
  3. I found that, even if it works on the first build, when you change something in your PCF and want to redeploy it to test it, you cannot simply rebuild the solution .cdsproj. It doesn't work. The PCF won't be updated (1), unless you bump the version number of the PCF in the ControlManifest.xml.
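That version number lives on the control element of the manifest. A sketch of the edit, assuming the namespace and name from the pac pcf init command above (the other attributes are the template defaults and may differ in your project):

```xml
<!-- ControlManifest.Input.xml: bump the version attribute to force an update -->
<control namespace="My" constructor="Label1" version="0.0.2"
         display-name-key="Label1" description-key="Label1 description"
         control-type="standard">
  <!-- ... rest of the manifest unchanged ... -->
</control>
```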

With pac pcf push it just works: I never had to update the version of the PCF before redeploying, and the changes were reflected immediately.

😒 Conclusions

If you're a perfectionist like me, always striving to score zero issues on every Solution Checker run, finding that the official, documented approach to working with PCFs in a lean way leads to problems... it's quite a pain.

To be honest, I still haven't found a proper, straightforward ALM approach to deal with the complexity of enterprise Power Platform / Dataverse / Dynamics 365 CE solutions.

Any approach (branching strategy, environment strategy, solution versioning, solution packing via pipelines, automated deployment) has its own drawbacks, and needs to be weighed against the internals of every different solution component.

All the strategies I've created myself and/or seen applied so far are almost always tailor-made for a specific client context.

🤔 What do you think about it?
Do you simply ignore the checks and argue with FastTrack architects when an audit happens?
Or have you found a proper way to deal with this without increasing the complexity of your development model?

💬 Let me know in the comments! 💬


Appendix

(1) For the third point, I tried:

  • Rebuilding the solution via dotnet build ..\..\pcf.cdsproj -c Release and then reimporting it via pac solution import --path pcf.zip -f -pc
  • Rebuilding manually the PCF project via npm run build -- --buildMode production, then rebuilding the solution via dotnet build ..\..\pcf.cdsproj -c Release, and then reimporting it via pac solution import --path pcf.zip -f -pc
  • Increasing the version of the solution (from 1.0 to 1.0.1), then rebuilding via dotnet build ..\..\pcf.cdsproj -c Release and reimporting it via pac solution import --path pcf.zip -f -pc
  • Increasing the version of the PCF (from 0.0.1 to 0.0.2) in the ControlManifest.xml, then rebuilding the solution via dotnet build ..\..\pcf.cdsproj -c Release and then reimporting it via pac solution import --path pcf.zip -f -pc

Only with the last approach did I see my PCF actually get updated.
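That working redeploy loop can be scripted. The version bump below is a naive text substitution I'm adding as an illustration (I edited the manifest by hand; the path and version strings are assumptions tied to the example project above):

```shell
# Hypothetical helper: bump the control version in the manifest.
# Naive sed substitution; assumes the old version string appears exactly once.
OLD='version="0.0.1"'
NEW='version="0.0.2"'
sed -i "s/${OLD}/${NEW}/" Label1/ControlManifest.Input.xml

# Then rebuild the solution project and reimport, as in the appendix:
# dotnet build ..\..\pcf.cdsproj -c Release
# pac solution import --path pcf.zip -f -pc
```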
