Quick Tip: Don’t let Dotfuscator auto-sign your assemblies without specifying the key file explicitly.
I spent the last hour beating my head against the wall, again, trying to figure out why Dotfuscator was creating a crap exe. I turned off every option; no help. I figured it must be because I am writing a Windows service, so I tried it on a normal exe of mine.
Everything worked fine there. The pain, oh the pain. Well, it finally dawned on me to try turning off the digital signature on my service. Voila! Dotfuscator no longer created the crap exe that threw awesome KernelBase.dll exceptions.
So, it turns out Dotfuscator’s auto re-sign sucks. The trick is that if you do the re-sign, make sure you explicitly specify the signing key path.
I have seen a ton of crazy workarounds for handling this seemingly simple task. It turns out that most people change the project targets and make other nitty-gritty manual edits to the project file. The way I am doing it is much easier.
Visual Studio gets really annoyed when a tool returns anything but zero, and it has good reason: the directives in your project file tell it to fail the build. So, what’s the remedy? Make every tool return zero… Impossible, you say! Nope. Easy.
So, it’s so easy it will probably annoy you. Take your scripts and put all the calls into a batch file. Make the last line of the batch file EXIT 0.
Yup, EXIT 0 <– that’s a zero.
Here is the one I use to disable my service when I build, so I don’t get the annoying file in use error.
net stop "Your service name here"
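A complete pre-build batch file, then, might look like the sketch below. The service name is a placeholder; substitute your own, and add whatever other script calls you need above the final line:

```bat
@echo off
REM Stop the service so the build can overwrite the locked exe.
REM net stop returns non-zero if the service is already stopped,
REM which would otherwise fail the Visual Studio build.
net stop "Your service name here"

REM Always hand a zero back to Visual Studio.
EXIT 0
```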
So, you get the old BadImageFormatException when installing your service. If you aren’t writing C++ extensions, then this is probably because you forgot to use the right version of InstallUtil for your target platform.
i.e. an x86 build using the x64 InstallUtil, or vice versa.
Easiest fix? Just build for Any CPU. Then you really don’t have to think about it too hard.
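For reference, the platform-specific InstallUtil binaries live under the Framework folders. The paths below assume .NET 4.0 (adjust the version folder for your target), and MyService.exe is a placeholder:

```bat
REM 32-bit (x86) service exe uses the 32-bit InstallUtil:
%WINDIR%\Microsoft.NET\Framework\v4.0.30319\InstallUtil.exe MyService.exe

REM 64-bit (x64) service exe uses the 64-bit InstallUtil:
%WINDIR%\Microsoft.NET\Framework64\v4.0.30319\InstallUtil.exe MyService.exe
```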
I put up my fix for the McAfee bug and got slashdotted. Ouch, awesome to have the exposure but my web server crashed. I just had it rebooted, so it’s happy now!
I guess my virtual server, which I expected to be decent, was no match for the traffic spike.
Well, thanks for listening. I hope the fix is helping you guys out; it definitely threw thousands into a spiral of WTF. I still can’t believe McAfee would screw up this bad.
I was able to fix the McAfee virus definition debacle that is hosing a bunch of people.
It turns out that they put in a bad signature that quarantines your SVCHOST.EXE, which can cause Explorer to crash. Good times, right?
So, the fix is pretty easy:
1) Restart into Safe Mode with Networking.
2) Open a command window. If Explorer isn’t started, hit CTRL-ALT-DEL and open Task Manager. Hit File, Run, type CMD.EXE, and press Enter.
3) Type DEL "C:\Program Files\Common Files\McAfee\Engine\avvscan.dat"
4) Type cd C:\WINDOWS\system32\dllcache
5) Type copy SvcHost.exe ..
6) Restart your PC. You are good to go!
NOTE: If you need help getting into safe mode, click here. Or, pull the power cord during the boot after seeing the logo. Then boot normally, which will give you the boot mode option screen. Pick Safe Mode with Networking (without networking, you can’t use your cached domain account).
The new .NET 4.0 changes the way that page validation occurs. Validation now runs in more places, which can cause some issues.
If you want to revert back to the old validation, just make the following change to your web.config:
<httpRuntime requestValidationMode="2.0" />
Also make sure you set validateRequest="false" on the page or in the pages element.
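Put together, a minimal web.config excerpt might look like this (placement inside the usual system.web section is assumed):

```xml
<system.web>
  <!-- Revert to the .NET 2.0 request validation behavior. -->
  <httpRuntime requestValidationMode="2.0" />
  <!-- Disable request validation for all pages. -->
  <pages validateRequest="false" />
</system.web>
```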
You can check out what Microsoft has to say here.
I had a colleague who kept getting errors doing their LDAP queries. I realized that most developers don’t know about a sticky gotcha with LDAP: the LDAP prefix in your URL has to be capitalized. Crazy, right?
So your query string has to start with LDAP://, not ldap://.
Nice gotcha, right?
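A minimal sketch of the gotcha in C#, assuming System.DirectoryServices; the server and directory path are placeholders:

```csharp
using System.DirectoryServices;

class LdapGotcha
{
    static void Main()
    {
        // Works: the provider prefix must be upper-case "LDAP".
        var good = new DirectoryEntry("LDAP://myserver/DC=example,DC=com");

        // Fails when it binds: lower-case "ldap" is not recognized
        // as the ADSI provider moniker.
        var bad = new DirectoryEntry("ldap://myserver/DC=example,DC=com");
    }
}
```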
I upgraded my project with the latest Telerik Q1 2010 assemblies and went to .NET 4.0 at the same time. Well, almost all of my pages that would issue a redirect after they completed their work started failing.
I spent over a day trying to figure out the source. I watched Firebug showing the server responding with gibberish on all the redirects. The symptom was an unintelligible response that the page didn’t even seem to be fazed by.
Well, I started to wonder why the hell I was getting binary data streamed back when it should have been a simple redirect header. It finally dawned on me: I had loaded the Telerik compression module a while back to try to speed up page delivery from their psychotically bloated pages.
I pulled out the compression modules and Voila! It all started working as it should. What a pile of crap that was!
Anyway, I hope this helps someone else out there pulling their hair out. No searches of mine found anything even remotely useful.
<!--<add name="RadCompression" type="Telerik.Web.UI.RadCompression" />-->
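If you registered the module in both places, remember that the IIS 7 integrated pipeline reads system.webServer rather than system.web; a sketch of commenting it out in both sections (placement assumed):

```xml
<!-- Classic mode / Visual Studio dev server: -->
<system.web>
  <httpModules>
    <!--<add name="RadCompression" type="Telerik.Web.UI.RadCompression" />-->
  </httpModules>
</system.web>

<!-- IIS 7 integrated pipeline: -->
<system.webServer>
  <modules>
    <!--<add name="RadCompression" type="Telerik.Web.UI.RadCompression" />-->
  </modules>
</system.webServer>
```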
I spent the last hour deciding which way to destroy my computer as I battled why my Telerik RadGrid was failing to show the filtered data set from my EntityDataSource. (Yes, I’m lazy using these visual controls, but for reports they work well and are easy.)
Turns out, I had filtering turned ON on the data grid. Apparently that disregards the WHERE filter on the data source. Well, that’s a pile of crap.
Anyway, turn off grid filtering or provide your data some other way to the data grid.
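A sketch of the fix in markup; the control IDs, entity names, and Where clause are all hypothetical placeholders:

```aspx
<%-- With column filtering enabled, the grid ignored the data
     source's Where clause, so turn the grid's filtering off. --%>
<telerik:RadGrid ID="ReportGrid" runat="server"
    DataSourceID="ReportSource"
    AllowFilteringByColumn="false" />

<asp:EntityDataSource ID="ReportSource" runat="server"
    ConnectionString="name=ReportEntities"
    DefaultContainerName="ReportEntities"
    EntitySetName="Orders"
    Where="it.Status = 'Open'" />
```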
Hopefully this saves you time.
You thought to yourself, “Self, I think I should sign all of my DLLs with my Authenticode certificate!” Why not? It would make the application appear secure, with digital signatures and all. I can use publisher identity attributes for security and all should be well. Right?
The issue comes to the way the .NET loader handles the assemblies, especially when offline.
See, the .NET Framework will go out and verify each DLL’s certificate against the CRL. This requires making sure the CRL is up to date and can incur an overhead of up to a couple of seconds. If you have a large number of DLLs, which many projects do, this can be quite an expensive task. And what happens if you’re offline? Well, it still tries to verify the signatures.
So, will the application run if it can’t verify the signatures? Yes. Yes it will.
Now you ask, “What do I gain by signing my dll’s?” If you are a vendor/publisher, you need it for Vista compliance.
Otherwise, I suggest you simply sign your EXE alone. There is a small performance hit, but if your application requires elevation, at least your customers will see the pretty name of your application and company instead of the dreaded “Unidentified Application” prompt.
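Signing just the EXE can be done with signtool from the Windows SDK. A sketch, where the certificate file, password, timestamp URL, and exe name are placeholders you’d swap for your own:

```bat
REM Sign only the main executable, not every DLL.
signtool sign /f MyCompany.pfx /p MyPfxPassword ^
    /t http://timestamp.digicert.com MyApp.exe
```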
You can take it from me: be careful of the performance impact of Authenticode-signing your .NET assemblies. (Yes, it even checks them if they are GAC’ed.)
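One mitigation I’m aware of (worth verifying against your framework version; it applies to .NET 2.0 through 3.5, and .NET 4.0 stopped generating publisher evidence by default) is to tell the runtime to skip publisher-evidence generation in your app’s .exe.config, which avoids the CRL lookup at load time:

```xml
<configuration>
  <runtime>
    <!-- Skip Authenticode publisher-evidence generation (and the
         CRL check it triggers) when loading signed assemblies. -->
    <generatePublisherEvidence enabled="false" />
  </runtime>
</configuration>
```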
Here is a good reference post.