
In the Trenches

Joe's Thoughts

  • InfoPath and MOSS 2007 Useful Links

    ◦ How to do custom themes for MOSS

    ◦ How to create a custom List

    ◦ How to set security permissions for an assembly

    ◦ How to create a code-behind page for a master page (keep in mind you have to do all the same security steps you do for a web part: web.config and all)

    ◦ Another way to do the custom additional menu

    ◦ Some useful videos (also look at the related videos)

    ◦ InfoPath web part

    ◦ When you need to debug why you are getting an unexpected error, try this

    ◦ Pass input parameters via query string to InfoPath

    ◦ How to use parameters to invoke a browser InfoPath form

    ◦ How to turn off toolbars

    ◦ How to use style sheets with InfoPath

    ◦ Useful-links entry on InfoPath performance

    ◦ How to make dependent dropdowns (only good for clients that have InfoPath installed locally)

    ◦ Good for web browser versions (as shown in the POC)

    ◦ Overview of web service support in InfoPath 2007

    ◦ Keep this one in mind when working with the above

    ◦ How to use external data (direct connections to data) in InfoPath

    ◦ For data validation, here is a great regex library link

    ◦ Keep in mind when using regex in InfoPath: remove the leading ^ and trailing $ from any regular expression you use, otherwise you'll get an error, and wrap the parts of your expression in parentheses ( ).

    ◦ How to conditionally show/hide controls

    ◦ How to create a custom Document Library to hold other types of documents (PDFs and such)

    ◦ Display charts via Excel Services

    ◦ InfoPath vs. ASP.NET

    ◦ A simple summary

    ◦ More detailed summary

    ◦ More info

    ◦ List of browsers that are compatible with InfoPath, as well as more info

    ◦ A cool tool to convert an InfoPath form to ASP.NET (in case you start off with InfoPath and later need to move to .NET)
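    One of the notes above warns that InfoPath rejects regular expressions with ^/$ anchors and wants the parts wrapped in parentheses. A quick sketch of that transformation, in Python purely as an illustration (the helper name is my own, not anything InfoPath provides):

```python
import re

def to_infopath_pattern(pattern: str) -> str:
    """Strip a leading ^ and trailing $ and wrap the body in parentheses,
    matching the note above about InfoPath's data-validation regex rules."""
    body = pattern
    if body.startswith("^"):
        body = body[1:]
    # Don't strip a $ that is escaped (a literal dollar sign).
    if body.endswith("$") and not body.endswith(r"\$"):
        body = body[:-1]
    return f"({body})"

# A typical anchored pattern for a 5-digit ZIP code:
anchored = r"^\d{5}$"
converted = to_infopath_pattern(anchored)
print(converted)  # the de-anchored, parenthesized form
```

    Since InfoPath validates the whole field value, dropping the anchors does not change what the pattern accepts.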


  • BD-4 parts needed list

    Looking for parts for my BD-4 project; this list is updated on a regular basis, and parts will be removed.

    Any of the welded items

    2 fiberglass Fuel Sumps

    2 Flush Gas Caps

    Fuel Sealant (whatever you have)

    2 Fuel Drains

    1 Fuel Valve

    2 Wing tips (for metal wings)

    Airspeed Indicator

    Sensitive Altimeter



    Any of the Basic 6 pack instruments (just want them as backup, will have glass cockpit)

  • Setting up IIS to use a UNC network share

    I’ve looked everywhere and have not found a good source for this information, so I figured I would write about it here in a quick note.

    My problem: I have a new server on which I created several virtual servers, and I grouped the virtual servers together under their own AD domain. I then set up IIS on one of those servers, created a standard website, went to the “Home Directory” tab, and selected the “A share located on another computer” option. I entered the network directory (\\server\share), clicked the “Connect As…” button, and entered the login and password of an account on the host machine that can access the network share.

    Now everything is going along pretty well, but here’s a little more info: the host machine is in a workgroup. It’s my home machine, and I want to have several sets of virtual server domains. I can run quite a few of them with dual (physical) quad-core 2 GHz AMD 64-bit processors, 32 GB of RAM, and 2 TB of disk space. I left the host in a workgroup to keep it neutral (or so I thought).

    So how does one pass through the security that was entered in the tab above? I kept getting 500 server errors and couldn’t figure out why. I had the account set up on the host, and everything was set up fine on the virtual box. Hmmm. Then I read this little note:

    “Pass-Through Authentication in a Workgroup Environment


    In a workgroup environment, all user accounts are local. Pass-through authentication using Basic authentication can still function, as long as both the IIS and file servers have user accounts with identical user names and passwords. This configuration quickly becomes an administrative burden and consequently is not widely implemented. For these circumstances, designating a single user account designed specifically for use with the UNC connection is likely the best choice.”



    This was found via this link:

    So, a nice simple issue, but I’m running a domain and a workgroup and talking between the two. No matter; it seems they are treated the same. Hence I created an account on the local virtual web server named exactly the same as the one on the host (same password). Boom, another error. I had to give the local account on the virtual web server write access to the “c:\windows\.........\Temporary ASP.NET Files” folder. Then all was well with the world, and I’m up and running.

    Now since this is just for my experimenting and home stuff, I wouldn’t advise anyone to do this for a production environment.

    I also want to take this time to talk about another tool.

    I created a base image and then used differencing disks for all my virtual servers. Well this made them all have the same SID when I went to add them all to a domain. I had to use this tool found here:

    The tool is called NewSID.exe. I ran it on each of the virtual servers and, boom, all was well. So if anyone hits this problem when setting up their test environments, I hope this is of some use.

  • How Microsoft Saved my life

    Many of you who know me know me as a big guy. At 415 pounds I was a very big guy. I would speak at conferences throughout the region; sometimes I think it’s my size that everyone finds easiest to remember about me. I’ve tried just about everything and changed my eating habits, but changing all my habits had only stabilized me. I’ve been about the same weight, give or take a few pounds, for a couple of years now.

    I’ve tried to go through the insurance plans of previous employers, but there was always so much red tape. I was once told by a rep of the insurance company that the reason they are hesitant to cover such operations is that most people don’t stay with the plan long enough for the company to see the benefits.

    Shortly after I started with Microsoft, I got a notice that my previous insurance wanted me to go through some additional hoops they hadn’t mentioned before. I laughed, since I no longer had them, but it was just another letdown and I felt really bad. So I called the doctor’s office and they talked to me about what the insurance company said. I told them, well, I’m no longer with them; do you want my new insurance information?

    So they resubmitted the claim, and I got a call a month later that they were going to schedule my surgery. I was like, WHAT? No hoops to jump through, no long wait lines. I was like, “are you sure?” I really could not believe my ears. At the time I was in the middle of training in Redmond, and I had to share the good news with all my newfound friends.

    I had commented on how well Microsoft takes care of its employees and how much time and training they invest in them. Then I found out how Microsoft does its insurance: they expect employees to stay a while, so they are not afraid to invest and have the things done that need to be done.

    I have just come home (1/29/08) from surgery. I had the Lap-Band surgery done, where they put a band around your stomach to control the amount of food eaten. I’m very sore as I type this, but I’m also filled with such happiness. I don’t know what else I can say. For a long time, diet after diet, I felt like I was doomed to die in my 40s (the average for most on my mother’s side). I feel like the possibilities of a much longer future are open to me now.

    I’ve been overweight since I was a little boy. People just don’t know the pain one feels living as an overweight person. I know I’m a smart guy, but I wonder sometimes how much my size has played in holding me back. The perception people have of overweight people is that they are lazy, clumsy, and not very bright. I’ve probably had to work extra hard and be extra detailed in everything I do to fight this perception.

    I’m happy that now, with the right controls and the right determination in place, I can reach my goal weight and finally my outside will match my inside. I’m thrilled at the prospect of finally realizing my lifelong dream of being a pilot, and I look forward to starting the training here in Tallahassee when I reach, or get close to, my goal weight. I’m also happy at the prospect of meeting my grandchildren one day, something my mother wasn’t able to do.

    I’m only sad about one thing: the fact that I will miss the 4th annual Code Camp South Florida. I’ve spoken at and attended every Code Camp South Florida since the first one, but not this year. My mind has been so focused between work and this operation, and now I’m so sore you just can’t imagine. I feel very sad about missing it and will miss seeing everyone attending. I hope to see many of the people I know at the next code camp in Florida.

    To all those who read my blog and that attend the code camps in the region, I look forward to seeing you.

    I don’t know what else to say; there are so many people who helped me along the way, all tying into this, that I don’t know how I could ever thank them enough. For some people it’s just a job; for me, I believe it saved my life. Time of course will tell, but finally I have hope for a real solution and a future.

  • Now at Microsoft!!

    Hi Everyone,

    Well this is just my announcement to the world.

    I’m now working for Microsoft as a consultant in the field.

    I started Oct. 15th, 2007, and man, my head is still whirling. There is so much to learn and so many resources. It’s simply amazing the resources Microsoft places in the hands of its employees. It just shows how valuable their employees are to Microsoft.

    In my previous experience, I might have had a single video and maybe an employee handbook; with Microsoft it’s more like 4 weeks’ worth of online training, plus 3 weeks at MSU (Microsoft University).

    While other companies give you a laptop and a cube and walk away, not Microsoft. Granted, I did have to install all the tools I believed I needed, but I was told where everything was and given a roadmap.

    Granted, I’ve only been here a couple of weeks, but I’m just so excited; I’ve never felt as valued for what I bring to a company before. Within one week I was already working on my first project, which I’m leading and which could be a pretty major project.

    If you’ve ever thought about working for Microsoft, I encourage you to give it a go. If you thought MS is some bloated company that can’t move fast, you would be so surprised. The amount of resources Microsoft employs to better serve its customers, and in research and development, is just unbelievable.

    Also, as a previous Microsoft MVP, I have always been a big supporter of Microsoft and its products. Now that I’m at Microsoft, I plan to continue my activities in the .NET community here in the Southeast. I will be at the SQL Saturday in Orlando coming up this November 10th; I encourage all to attend. While I’m no longer an MVP, I look forward to a very active FY08 in the community.

    I’m hoping to use some of my newfound resources to help the community in ways I couldn’t before. I’ve been thinking of some interesting topics for FY08; please vote on any of them.

    • LINQ: how does it affect DAL/BL development, and how can you employ it to the max?

    • LOB (Line of Business) WPF: bringing 3.5 together

    • JavaScript to the max in VS.NET 2008, plus do’s and don’ts

    • InfoPath, SharePoint Workflow, and MOSS: bringing you solutions faster

    These might not all be ready by the start of FY08. I have three weeks of solitude in Redmond coming up, and I plan on using every bit of it (after classes and other commitments) to get a good start on these subjects.

    Please post a message of support for any of these subjects, or if you think I’m wasting my time on any of them, or have a subject you’ve been dying to get covered, then let me know.

    Till I see you all again, have a great one. I will of course keep you all up to date in my journey in Microsoft.


  • Ajax Update Panel, Not all it’s cracked up to be (at least sometimes)

    Ok, I’ve found a nice little bug; it’s rather simple.

    Take a web form with an update panel (regardless of how the update panel is set up). Also add a placeholder inside the update panel.


    Then build yourself a nice little user control and place a simple textbox on it, then add a simple CompareValidator. Have the validator do a data-type check on double or something. Add another textbox so you have one to go to after testing the first. Also put in a button; it doesn’t need to do anything, just a simple ASP button.


    Now, on load of the page, have the user control created dynamically via the “LoadControl” method, and load it into the Controls collection of the placeholder in the update panel.


    Now enter some alpha text into the textbox and note how the validator won’t fire. Also note what happens when you click the button, which, BTW, is inside a user control, which is in a placeholder, which is in an update panel, and has no code at all behind it: it causes a whole-page postback.


    I believe this is because something short-circuits the update panel, bypasses it, and acts as if the control were directly on the form.


    Hence this statement, straight out of the MSDN docs:

    When you load a control into a container control, the container raises all of the added control's events until it has caught up to the current event. However, the added control does not catch up with postback data processing.


    So it seems only containers have this LoadControl method, which is why you see the results that you do. For one, it seems the JavaScript inside the validator controls gets lost or never registered. That, and all postbacks hit the form as if the update panel didn’t even exist.


    Now do the same thing, but instead of loading the user control dynamically, place the user control inside the update panel at design time. Run it and you will find everything now works, and the whole page does not post back when the button is clicked.


    This is the expected behavior.


    Hence the lesson learned here: don’t use dynamically loaded user controls with update panels.


    In my next post I will go over how to do a lot of the things you would do with the Ajax framework, all without Ajax.


    For those of you who might be wondering, I’ve been having my share of fun with the Road Home project in Baton Rouge, Louisiana. If you have never been to Baton Rouge, well, you are not missing anything, trust me!


    The Road Home project is a series of applications and programs for getting people back to Louisiana. As some of you might know, after the storms people were spread across the country. This project was put in place to help people back to their home state and home cities/towns. It’s quite an interesting project, to say the least.


    I’ll be in Tampa doing a talk or two at the Tampa Code Camp this July 14th; see you there!

  • WSE 3.0 How to Setup Mutual Authentication

    Now, there are plenty of posts out there that deal with WSE, but we are going to concentrate solely on mutual authentication.

    Mutual authentication requires certificates in both locations, though it is possible to use a single certificate. The problem is that once the single certificate is compromised, the whole system is compromised; we will go deeper into this soon.

    Let’s look at a simple setup:

    Now we want to make it so that all communications between this client and server are secure and only this client can talk to this service.

    In order to achieve this, we must create two certificates that each contain a public/private key pair as well as the Digital Signature, Key Encipherment, and Data Encipherment key usages. We will go through how to create this type of certificate later.

    Hence here is how we have to break down the certificates.

    Now, in order for mutual authentication to work, the service (server side) needs the certificate of each client you plan to allow to connect (authorization: authorized clients). The client only requires its own certificate and the public-key certificate of the server.

    Now first we need to have .NET 2.0 and WSE 3.0 installed as well as VS.NET 2005.

    Let’s take a look at what the client’s web.config looks like (and what it must contain). When you run through the WSE 3.0 wizard in VS.NET, several things get added to your configuration file. One of the items worth mentioning is:


        <microsoft.web.services3>
          <policy fileName="wse3policyCache.config" />
          <security>
            <x509 allowTestRoot="false" verifyTrust="true" />
          </security>
        </microsoft.web.services3>



    This tells WSE3 what policy file to use; it also lets it know whether we are running with test certificates and whether to verify the trust of the certificate.

    If verifyTrust is set to true, then the issuer of the certificate has to be a trusted source (as does the issuer of the server certificate). If you run into trust issues on the client with the server certificate, you can always add the certificate to the Trusted People store. One way to end up with an un-trusted certificate is to issue certificates from your own CA while the foreign computer it is imported into is not in your domain. Since that computer is not in your domain, it can’t walk the certificate chain to verify that it trusts the issuer of the certificate.

    There are times when you will need to debug what’s going on in your web service. You would simply add this line and place it as follows:




        <microsoft.web.services3>
          <diagnostics>
            <trace enabled="false" input="InputTrace.webinfo" output="OutputTrace.webinfo" />
          </diagnostics>
          <policy fileName="wse3policyCache.config" />
          <security>
            <x509 allowTestRoot="false" />
          </security>
        </microsoft.web.services3>




    Setting the trace enabled flag to true will save all information that either hits or is sent from the service. If you turn this on on the web service side, then input = data coming in from the client and output = data going to the client. If you turn this on on the client side, then input = data coming in from the service and output = data going out to the service. Most problems will generate a WSE910 error; without this trace information you will go insane trying to find out what the problem is.


    Now let’s look at the policy file. It is also possible to have more than one policy; for instance, you might have a client application that calls several web services. Note that if those web services are on different computers and you are using mutual authentication, your client will need the public-key certificate for each service.


    <policies xmlns="">
      <extensions>
        <extension name="usernameForCertificateSecurity" type="Microsoft.Web.Services3.Design.UsernameForCertificateAssertion, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        <extension name="x509" type="Microsoft.Web.Services3.Design.X509TokenProvider, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        <extension name="requireActionHeader" type="Microsoft.Web.Services3.Design.RequireActionHeaderAssertion, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
      </extensions>
      <policy name="ClientPolicy">
        <mutualCertificate11Security establishSecurityContext="true" renewExpiredSecurityContext="true" requireSignatureConfirmation="true" messageProtectionOrder="SignBeforeEncrypt" requireDerivedKeys="true" ttlInSeconds="300">
          <clientToken>
            <x509 storeLocation="LocalMachine" storeName="My" findValue="CN=Clntcert" findType="FindBySubjectDistinguishedName" />
          </clientToken>
          <serviceToken>
            <x509 storeLocation="LocalMachine" storeName="My" findValue="CN=Srvcert" findType="FindBySubjectDistinguishedName" />
          </serviceToken>
          <protection>
            <request signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="true" />
            <response signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="true" />
            <fault signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="false" />
          </protection>
        </mutualCertificate11Security>
        <requireActionHeader />
      </policy>
    </policies>



    Ok, as you can see we have a “ClientPolicy” set up here. We are not going to go through everything, but what I wanted to point out, because it causes the most trouble, is that everything here (including the “extensions” at the top) needs to be the same as in the server’s policy file. If you do something as simple as setting establishSecurityContext="false" while the server has it set to "true", then boom, the connection between the client and server will fail. The only intended differences are which certificates to use: here you see the client’s certificate “CLNTCERT” used for the client token, and for the service token we have “SRVCERT”, the public-key certificate of the server that is installed on the client.

    As you can see, it’s very important that settings match up; any one mismatched setting will cause a 910 error that means NOTHING AT ALL. It’s so general that there are hundreds of things that could cause it, including the two machines being out of time sync.

    Another common error is that WSE can’t read the private key because of access security issues. If you run a trace it might tell you “Object contains only the public half of a key pair. A private key must also be provided.” This could be a messed-up certificate, or it could be that the account can’t read the certificate. The only way I’ve found to fix this (after verifying that the client certificate has a public/private combo) is to give “Everyone” full rights to the “C:\Documents and Settings\All Users\Application Data\Microsoft\Crypto” folder. Yes, I know this isn’t nice, but I have personally tried giving the proper accounts (Network Service and such) the rights needed, and something, somewhere, keeps things from working. Setting “Everyone” to full control fixes the problem. You must also hit the “Advanced” button and replicate the permissions to every subfolder and item. Several searches on Google show that many other people had to resort to the same extreme to get things working.

    Now, if I sound like I just know how to make it work, not how it works, you are right. Many long days on the phone with MS and many long hours chasing issues have taught me one thing: change one value and nothing works; it’s that simple. Just one changed value gets you a very vague error and hours spent tracking it down, to the point that you just concentrate on making sure things are the same across the board.

    Let’s take a look at a service’s Policy file.

    <policies xmlns="">
      <extensions>
        <extension name="usernameForCertificateSecurity" type="Microsoft.Web.Services3.Design.UsernameForCertificateAssertion, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        <extension name="x509" type="Microsoft.Web.Services3.Design.X509TokenProvider, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        <extension name="requireActionHeader" type="Microsoft.Web.Services3.Design.RequireActionHeaderAssertion, Microsoft.Web.Services3, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
      </extensions>
      <policy name="IEPUserGroupList">
        <authorization>
          <allow user="CN=IEPWSE" />
          <deny user="*" />
        </authorization>
        <mutualCertificate11Security establishSecurityContext="true" renewExpiredSecurityContext="true" requireSignatureConfirmation="true" messageProtectionOrder="SignBeforeEncrypt" requireDerivedKeys="true" ttlInSeconds="300">
          <serviceToken>
            <x509 storeLocation="LocalMachine" storeName="My" findValue="CN=Srvcert" findType="FindBySubjectDistinguishedName" />
          </serviceToken>
          <protection>
            <request signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="true" />
            <response signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="true" />
            <fault signatureOptions="IncludeAddressing, IncludeTimestamp, IncludeSoapBody" encryptBody="false" />
          </protection>
        </mutualCertificate11Security>
        <requireActionHeader />
      </policy>
    </policies>



    As you can see, much of it looks the same (such as the service token). The big difference is the “Authorization” section, where we set who is allowed to use this web service.

    The “<allow user="CN=IEPWSE" />” element sets an allowed certificate; to allow more clients, just keep adding more of these elements. The “<deny user="*" />” element pretty much says don’t allow anyone else, unless of course they have already been allowed in an allow element.

    Let’s talk a bit about creating test certificates. We can easily create test certificates via the certificate creation tool Makecert.exe.

    We simply create a couple test certificates:

    makecert.exe -sr LocalMachine -ss MY -a sha1 -n CN=DevServer -sky exchange -pe DevServer.cer

    makecert.exe -sr LocalMachine -ss My -a sha1 -n CN=DevClient -sky exchange -pe DevClient.cer


    One thing you want to be sure about is this option in the web/app.config. The default looks like this:

    <x509 allowTestRoot="false" verifyTrust="true" />

    For testing, change it to:

    <x509 allowTestRoot="true" verifyTrust="false" />

    Do this on both sides (web service and client); this way the test certificates created above will work. Note that this is strictly for testing.


    You still need to make sure your service or application’s security account has sufficient rights to access the certificate’s private key. The WinHttpCertCfg.exe tool can help grant access permissions for certificates; you can find it in the WSE 3.0 SDK’s installation folder.


    Now let’s talk about creating real certificates. The following outlines how to create certificates from a certificate authority.



    You are using WSE 3.0. Everything was working fine during development on another client machine.

    However, that dev setup is not valid on the network, so you are setting up new certificates on both sides.

    You are using WSE for authentication

    When setting up the WSE configuration, you get this error on the client when selecting a certificate:

    "Selected Certificate does not support data encryption"


    WSE 3.0

    Windows 2003

    Troubleshooting /Resolution 

    1. From

    Error message - Security token does not support Data Encryption.

    Cause - The Key Usage property of the certificate does not include Data Encipherment.

    Remedy - Use a certificate with a Key Usage property that includes Data Encipherment.

    2. We reviewed the certificate and the key usage settings are  - Digital Signature and Key Encipherment.

    The certificate also needs to include a key usage of Data Encipherment, per the MSDN documentation on WSE 3.0.

    You were not certain how to get such a certificate, but had access to the CA Server.

    3. We discussed the following article but it did not quite help.

    KB 273856: Third-party certification authority support for the Encrypting File System

    4. We discussed how we can request a new certificate from the browser by going to http://CAName/certsrv

    We can select "Other" for "Type of Certificate needed" and then put in this identifier for OID "" and get a certificate that has Data Encipherment as a Key Usage

    However, your web interface for requesting a certificate looks very different.

    You do not have the same options when requesting certificate from browser.

    You have an option to "Select Template", options in the drop-down menu include "Basic EFS", "EFS Recovery Agent", "User", etc.

    We tried each of these options, however none of them issue a certificate with Key Usage of "Data Encipherment"

    5. So, it looks like the CA setting of the templates would need to be modified.

    You have full access to the CA server.

    6. We looked into possible ways of modifying the templates. We discussed that none of the built-in certificate templates issue a certificate with "Data Encipherment" as a Key Usage and that is precisely what WSE needs.

    Here are the steps we took to create a new template with a Key Usage of “Data Encipherment”.
    1. Open the certificate template on the CA Server machine - certtmpl.msc

    2. Find a template that is close to the type of template we want to create, right click on it, and select "Duplicate Template". In our case, we selected the "Web Server" certificate to duplicate.

    3. Give a new name to this new template and make the modifications necessary:

    a. On the General tab, Select "Publish certificate in Active Directory"

    b. On the request handling tab, from the "Purpose" drop down list, select "Signing and encryption"

    c. On the Extensions tab, select "Key Usage" and hit Edit. Under Encryption, the second radio button should be selected "Allow key exchange only with key encryption". Under this select the checkbox for "Allow encryption for user data" (this is what gives the "Data Encipherment" key usage). "Digital Signature" should already be selected; if not, select it.

    d. On the Extensions tab, select "Application Policies" and hit Edit. Then add "Client Authentication",  if it is not already in the list of "Application Policies"

    4. Open Active Directory Sites and Services. Go to view and select "Show Services node" if Services node is not visible.

    a. Expand Services, Public Key Services, Certificate Templates and select the new template

    b. Go to the properties of the new template and go to the Security tab. Select ENROLL for the appropriate user(s).

    5. Go into Certification Authority MMC and Right Click on "Certificate Templates" and select New->Certificate Template to Issue. Select the new template

    6. Now, you should be able to browse to the certificate server's web interface (http://localhost/certsrv) and request the new certificate type.

    We walked through the steps above and this seemed to work! Now you were able to get a new certificate with the correct key usage.

    So, we got a certificate with following properties:
    Key Usage  - Data Encipherment, Key Encipherment, Digital Signing

    Purpose – Client Authentication, Server Authentication.

    Root Cause

    The certificate being used did not have a Key Usage of “Data Encipherment”


    Error message - Security token does not support Data Encryption.

    Cause - The Key Usage property of the certificate does not include Data Encipherment.

    Remedy - Use a certificate with a Key Usage property that includes Data Encipherment.

    Related Knowledge Base Articles


    Tips & Tricks

    Here we are just going to list out the things that cause most of the problems after you get things setup.

    • MAKE SURE YOUR URLS ARE CORRECT (to the service you are calling)!
    • Make sure the client has the public half of the server certificate and the server has the public half of the client certificate.
    • Make sure your policy files are configured (as far as options) exactly the same on both the client and the server.
    • Be sure to either give “Everyone” full control of the crypto directory “C:\Documents and Settings\All Users\Application Data\Microsoft\Crypto” or use the WinHttpCertCfg.exe utility.
    • WSE910 can be anything; turn on tracing on both the client and the server.
    • Make sure the client and server clocks are in sync; they cannot be out of sync, since SOAP messages are time sensitive (expiration times).
    • Be sure the certificates you create support Data Encipherment, Key Encipherment, and Digital Signature.
    • If you are using certificates that cross domains and have the web/app.config option verifyTrust set to true, you must have all the client certificates in the “TrustedPeople” store. If you don’t, they won’t be trusted and you will get a WSE910 failure.
    • Unit test your web service before enabling certificates; if it fails for non-WSE reasons, it will still surface as a WSE910 error.

    An important thing to keep in mind at all times is that this is a message-level defense: people can still hit the service and see its methods; they just can’t execute any of them.
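    The tracing tip above can be sketched as a web.config fragment. This is a hedged sketch assuming WSE 3.0 (the section name is `microsoft.web.services2` for WSE 2.0); verify the element names against the WSE documentation for your version, and note the `.webinfo` file names are just the conventional defaults.

```xml
<!-- Hypothetical sketch: enable WSE 3.0 message tracing on client and server.
     Adjust the section name to match your WSE version. -->
<configuration>
  <microsoft.web.services3>
    <diagnostics>
      <trace enabled="true"
             input="InputTrace.webinfo"
             output="OutputTrace.webinfo" />
    </diagnostics>
  </microsoft.web.services3>
</configuration>
```

    With this in place on both ends, the trace files show the raw inbound and outbound SOAP, which makes a WSE910 much easier to pin down.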

  • Chars and percent signs %%

    Who would have thought?

    I needed a stored procedure that could take several possible values, but some would not be passed. I’m sure some of you have found better ways to do this. I’ve seen some insane ways of using XML and creating dynamic SQL to do this, but I didn’t want any dynamic SQL.

    So I had this as a list of parameters

    @CustomerID char(5) = null,

    @CustomerName varchar(100) = null,

    @CustomerAddress1 varchar(100) = null

    This is just a sample of parameters; there could be many more or fewer.

    Then I have my “IF” statements.

    If @CustomerID is null
        Set @CustomerID = '%'

    If @CustomerName is null
        Set @CustomerName = '%'
    Else
        Set @CustomerName = @CustomerName + '%'

    If @CustomerAddress1 is null
        Set @CustomerAddress1 = '%'
    Else
        Set @CustomerAddress1 = @CustomerAddress1 + '%'

    Ok, now let’s look at the select statement:

    Select * from Customers
    Where
        CustomerID LIKE @CustomerID AND
        CustomerName LIKE @CustomerName AND
        CustomerAddress1 LIKE @CustomerAddress1

    Now, if you have never done an SP like this: it allows me to pass any one value, or all three, and boom, the query returns a filtered result set. If I pass NULL for all three parameters, I see all customers; if I pass a value for any one parameter, it filters on that value; if I send in more than one parameter value, it filters on however many I populated.

    Column LIKE ‘%’ returns everything.

    So why did I write this blog post and mention chars and %% percent signs at the very top?

    Well, I noticed an interesting bug (if you want to call it that). If the parameter is a CHAR datatype and you set the parameter to '%' (because NULL was passed), the query would not return anything. Not a single record. What really puzzled me was that if I manually put CustomerID = '%' in the query, it worked just fine.

    So I figured there was something about the setting of the parameter (which is defined exactly like the column (granted, I don’t like CHAR IDs)). Then it struck me: what if, when I do "SET @CharParameter = '%'", the value actually turns out to look like '%    ', since I only populated the first char of a CHAR(5)? Then it will only bring back rows that match that padded pattern. So I had to do some fast thinking, and I populated the CHAR parameter like this:

    Set @CharParameter = '%%%%%'

    Why does this work, you might ask? Well, the value now looks like '%%%%%', since I filled all five chars of the CHAR(5) with percent signs. Nothing has to appear between the % signs, so it gives back all values. This isn’t a problem with datatypes that aren’t fixed-length, but it sure was a real headache to find.
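    To see why the padding matters, here is a rough simulation of LIKE matching. The `sqlLike` helper is mine and only approximates T-SQL semantics, but it shows the difference between a bare '%', a CHAR(5)-padded '%    ', and '%%%%%':

```javascript
// Rough, hypothetical simulation of T-SQL LIKE: % -> .*, _ -> .
function sqlLike(value, pattern) {
  // Escape regex metacharacters so only % and _ act as wildcards.
  const escaped = pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  const re = new RegExp('^' + escaped.replace(/%/g, '.*').replace(/_/g, '.') + '$');
  return re.test(value);
}

const padded = '%'.padEnd(5);   // what CHAR(5) storage does to a single '%'

console.log(sqlLike('AB123', '%'));      // bare % matches everything
console.log(sqlLike('AB123', padded));   // '%    ' demands 4 trailing spaces
console.log(sqlLike('AB123', '%%%%%'));  // all five chars filled: matches again
```

    The middle case is the bug: the padded pattern only matches values that literally end in four spaces.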

    I hope this helps someone else, since I couldn’t find anything on this.

  • SiteMapPath navigation control for ASP.NET 2.0

    What can I say? I’m sure many of you have put this control to use, but here I want to publicly complain about a couple of issues.


    For one, it would have been nice if there were an event that fired when someone clicked a link; that could have been handled much like my solution shows.


    The other thing is support for a simple target. Yep, you can put in a link but not a target. Now how annoying is that? I don’t think the people who developed this control had framesets on their minds at all.


    Well, on to my solution.


    It’s rather simple when you think about it. I used the standard method of using this control, that is, in combination with a web.sitemap XML file.


    I then did this to the node in question.

    <siteMapNode url="javascript:OpenHome();" title="HOME" description="" >

    Now, that’s all cool; notice it opens home. My problem, and the reason I had to come up with this solution, is that one of the pages in the application required frames. Yes, I could have purchased or written my own little splitter control or something, but in the end all browsers understand framesets (I was having trouble with JS in different browsers).


    Plus this was only for one single part of the application which isn’t publicly exposed.


    Also note I did this with a master page, with the SiteMapPath control on that page. So I just put this little JS script on the master page and all was well:


    <script language="javascript" type="text/javascript">

        if (self.location.href.indexOf("PageInFrameThatUsesMasterPage.aspx") < 0) {
            if (self != top) {
                top.location.href = self.location.href;
            }
        }

        function OpenHome() {
            top.location.href = "";
        }

    </script>

    Now, the script above makes it so I don’t break out of frames on the one page where I don’t want to break out. Of course, I could have just made a small function that takes a string for the URL I put in the XML; that would have made it even smaller. This is the first-draft solution that worked OK for me.
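    That smaller function could be sketched like this. The name `navTop` is made up; each siteMapNode would then use `url="javascript:navTop('SomePage.aspx');"` instead of a dedicated function per page. It returns the URL so it can be exercised outside a browser:

```javascript
// Hypothetical generalization of OpenHome(): one helper that takes the URL.
function navTop(url) {
  // In a browser this breaks out of any frameset by retargeting the top window.
  if (typeof top !== 'undefined' && top && top.location) {
    top.location.href = url;
  }
  return url; // returned so the helper can be tested outside a browser
}
```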


    That’s it, nice and simple, though I didn’t find many answers out there on my problem. I couldn’t use my break-out code on the home page, since it was another site altogether and I didn’t have control of it. Also, don’t try the onunload event; that fires on every refresh and postback.





  • Code Camp Jacksonville


    Well, another code camp, and all went well. I sometimes don’t like commenting on events; I did that once and stuck a foot in my a##.

    We stayed at the Hyatt in downtown Jacksonville, only a few blocks from the event and around the corner from the Saturday Pub Club.

    I’ve got to say, it takes a special breed of people to put these events together; the CC committee did a great job. The event started as early as Friday with the speaker pub club. It was held at Dave and Buster’s, a nice place with cool games; my only problem was how far out of the way it is (though isn’t everything in Jax?). Jax CC is the only code camp this year to hold its pub clubs in two separate areas, LOL; not bad, just a funny note.

    The pub club event on Saturday was held at the way cool and very blue “Blue Room bar”; I still love that bar counter. It was a great place: it overlooked the river and was right around the corner from the hotel where many of us (from out of town) were staying. One thing to note for the future (just friendly advice): I overheard some people wanting more items that weren’t so fishy (I was told there was chicken there, but only saw meatballs) and more veggie items for those that don’t eat meat. One other issue is that the web site said the event was at the Copper Cellar. Other than that it was a blast. I salute the CC committee for a great find; I couldn’t imagine a better place, and blue is my favorite color. Now if only there wasn’t a bucks game that night, LOL.

    The actual event went well. As I was a little late myself, I hope all who came enjoyed the talk. I missed the keynote, though I heard it was great, or long, depending on who you ask, LOL. Lunch was pizza, which is great: no lines, or very short ones. People are there for the sessions, right? We served pizza in Tallahassee as well; people grab what they want and go. I cut out a little early and didn’t see the end, since I wanted to run over and get an enhancement to my already existing tattoo, but the rest of the event went well from what I’m told.

    I’m already beginning work on a new game development talk (focused on XNA Express edition) and a new DAL/BL talk using generics and nullable types along with serialization. Look for them in next year’s line-up. I’ve heard the new year will start with Atlanta; wow, that will be cold, which should make the hotel rooms plentiful. I know if they have it in Decatur again I am staying downtown; I’m not going on that adventure again on Memorial Drive, LOL. No hourly hotel rooms for me.

    My talk was made possible by the generous support of my employer (Idea Integration). Their support of my community work has been great. It’s truly great to work for a company that realizes the positive benefits of community support. Idea has been great to work for so far; like any company they have their issues, but I’m happy with them, as I hope they are with me.


    I’ve taken over all development of ConnectBlocks and am working on a new version that will be scaled down a little on first release, set for late 2006 or early 2007. It will be based on the 2.0 framework, and it will be database driven. The former version wasn’t, and that caused a ton of problems when working on projects as a team. I didn’t originally want to go the file route, but got overruled. Not anymore!

    Expect the first 2.0 version to support more than one developer, with built-in source-safe abilities. Also expect the MSI builder to be more customizable, as well as the compiled HTML help builder, report builder and DAL builder. We had talked in the past about CB being able to deploy to the web; that’s still in the works, though it’s been a bit more complex than originally thought. Once these other areas are revised, work will begin on the web deploy. Several factors have come up, including the fact that not all win controls have a counterpart on the web (not one that is free or comes with the framework); this will take additional time. In the meantime, several portions of the application will start to be available separately (Report Builder, DAL Builder, MSI Builder and Help Builder).

    I’m adjusting the main portion of the application to work with the enhanced versions of the previously mentioned builders. They will be released separately as they are done, and when all are done you will see a new version of the total ConnectBlocks project. This will allow everyone to start using these portions alongside other IDEs such as Visual Studio. There is some debate as to whether these should be plug-ins; that is possible and might be done later, but not at the start.

    One other very important feature will be import and export to and from Visual Studio. I know many of you have had reservations about using CB for any real projects because of feeling locked in; that’s not the case once you can export. The ability to import will also let you reuse some of the same forms from your existing projects.

    Well, that’s all the updates, and my latest code camp experience (great job, Jax CC committee). I have another conference in Nashville, TN on Oct 13th (DevLinks); it looks to be pretty cool.

    If any of you have any questions, feel free to contact me. 

  • ASP.NET Treeview 2.0, Javascript madness and more

    What can I say, I’m not happy.


    First, let’s look at subclassing; this must be an issue with more web controls. You would think they (the ASP.NET team) would have made the controls smart enough to serialize the exact types in the object to view state.


    Here is what happened to me: I used the TreeView control, subclassed TreeNode, and added a couple of new simple properties. I expected, on postback, to cast the selected node back to my subclassed TreeNode. Guess what happened: it was just a regular TreeNode. It appears that on postback the control reconstructs the TreeView but uses the base classes to do it.


    If this is how things are going to work, why bother allowing subclassed objects? I know it might be a pain, but give some warning. It appears the only way to have a subclassed TreeNode is to customize how the control serializes to and deserializes from view state, and this is way too much WORK!!!!!


    So needless to say, don’t subclass a control and expect view state to work right away.


    Some other interesting things I have found drive me crazy. Why doesn’t Firefox support the ondrag event? The only solution is a crazy mix of onmousemove, onmousedown, onmouseup and onmouseout, just to duplicate the same effect. So far I have looked at several frameworks, some of which I want to make known here. A good friend, William Rawls, passed me these interesting links.


    using prototype: or
    zend framework:


    Now you might be wondering how I got into the whole ondrag rant. Well, I was trying to resize a table column (at runtime). I didn’t want to use frames (which would have worked) because everyone seems to hate them or recommends not using them; the one excuse I hear is that they don’t index well. My main problem is that I wanted it to work within a user control. The solution I first came up with worked in IE; then, after trying it in Firefox, I found out it didn’t work. So I revised all the code and tried to get it to work using several of the events above; still no luck. There are so many differences between how IE and Firefox (and browsers like it) read JS. There really needs to be a good cross-browser IDE for this stuff; I did a quick search on Google but nothing popped up.
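    The mousedown/mousemove/mouseup combination above can be sketched as a small state machine. All the names here are mine, and the tracker is deliberately DOM-free (you would wire `element.onmousedown` etc. to these handlers and apply the returned delta to the column width):

```javascript
// Minimal, hypothetical sketch of emulating drag without ondrag.
function createDragTracker() {
  let dragging = false;
  let last = null;
  return {
    mousedown(x, y) { dragging = true; last = { x, y }; },
    // Returns the movement delta while dragging, or null otherwise.
    mousemove(x, y) {
      if (!dragging) return null;
      const delta = { dx: x - last.x, dy: y - last.y };
      last = { x, y };
      return delta;
    },
    // mouseup (and, in practice, mouseout on the document) ends the drag.
    mouseup() { dragging = false; last = null; },
  };
}
```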


    My solution came in the form of this nice little DnD JS framework with other features.

    I haven’t explored all of its wonders yet, but I really like it.


    I also want to note one other site:


    This site seems to have a load of nice JavaScript frameworks all indexed on this page.


    OK, my last beef. What were they thinking? I’m plugging along in VS (Visual Studio 2005) and I want a TreeNode, when clicked, to load a page into an IFrame. I set the Target to the ID of the IFrame and the NavigateUrl to what I want loaded, and it never worked. At some point I remembered that you have to set the Name property of the IFrame. I had it like that originally, but VS complained that it didn’t meet XHTML 1.0 or some such, and that Name was considered outdated; but guess what, if the Name is set then it all works. So why say something is outdated if nothing works any other way? I’m not sure if this is a TreeView issue or what. I’ll tell you something else: if I have a Value set on the TreeNode, I expect that when it is clicked the value will be sent to the IFrame as a query parameter, but it isn’t; I have to set what I want sent myself, which bites.
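    A minimal sketch of the markup described (the IDs and page names here are made up): the point is that the iframe carries both `id` and `name`, and the TreeNode’s Target refers to that name.

```html
<%-- Hypothetical sketch: Target resolves against the iframe's name attribute. --%>
<asp:TreeView ID="navTree" runat="server">
  <Nodes>
    <asp:TreeNode Text="Reports" NavigateUrl="Reports.aspx" Target="contentFrame" />
  </Nodes>
</asp:TreeView>

<iframe id="contentFrame" name="contentFrame" src="Blank.aspx"></iframe>
```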


    I hope this helps others, as my adventure back into ASP.NET development is painful at best. I still don’t know why people put up with such limited interfaces, or why the controls we have to work with from MS in VS are so crude: not one tabbed control, not one splitter control, not even a decent JS framework for use in our apps that would take care of a lot of this.


    Also, did I mention how much I hate how some third-party vendors sell their wares? Really, think about it: most of them sell not per developer but per site (though keep in mind each developer that runs stuff on their own PC has to have their own copy). This totally doesn’t work when you do sites for several clients; each client is required to pay that price. Doesn’t anyone else see this as unfair pricing?


    Anyways, these are my thoughts and they do not reflect the thoughts of my employer.


    I also want to note, if anyone is looking for a job, I know several people looking, including my new employer.


    You might be wondering how things are going with my new employer; well, they are going well. I do worry, though: I’ve had some thoughts passed along that disturb me. Still, for me everything has its place and use, for I know where I want to go in the coming years: either work for MS (designing IDEs or controls/components for upcoming versions) or get my www.connectblocks project off the ground and producing.


    For those of you who wonder how things are going with my project, wonder no more. It’s still being worked on, and I’m taking full control over the project myself. This means that as soon as some of the bugs are worked out and the documentation is done, it will go to market; look for it either late this year or early next year. You can download the beta at any time, though.


    Enjoy your life, you only have one to live.






  • Idea Integration

    It’s amazing the power of networks, and the simple act of knowing a lot of people. Being involved in the community made a transition in the midst of a tragic personal loss not only quick, but seamless.  To everyone that helped me reach a solution, thank you from the bottom of my heart.


    I’ve received an excellent opportunity from Idea Integration, one which will allow me to live/work in Tallahassee and continue to do my community work (User Group code camps and conferences).  Idea seems to be very cutting edge and I’m really excited about getting in and learning a bunch of new concepts. The shiny side of that coin is that I will continue the work on ConnectBlocks and the work I was doing with Jay – my friend, partner and greatest supporter.   Jay, as you all know from my previous entry, has passed on, but in his religion he hasn’t passed on to some other place, he’s passed on to another life on this earth. I will deeply miss him and all he had done to help me. His funeral service was on Monday with the cremation on Wednesday. It was very moving, it was pretty hard on me personally, I’m not normally so sensitive plus I never cry in public, but man I just couldn’t control that one.  


    BTW, Jay would scold me if I didn’t mention the newer version of ConnectBlocks available for download and the latest news.  Well the one thing I was working on before I got flooded a few months back, was a deployment option to deploy the form to the web. I have a cross lookup table to match up controls (win to web) and to find compatible properties/methods. This way a web form will look pretty much like a winform and behave the same way.


    I had built a new DataSourceControl that has a WinForms CurrencyManager object within it; this allows me to replicate the same kind of smart-client state within a web project. Who says you can’t have state? It also controls all the binding. There will have to be some new controls (datagrid), since some just don’t behave like I would want them to. There will be heavy use of Atlas in the generated output (this will keep postbacks to a minimum and give you more of a Windows-app experience). This is all coming in the 2.0 version of ConnectBlocks, which is also built on the 2.0 framework; look for it later this year (December or so).


    In other news, Code Camp Tallahassee 2006 (2nd annual event) went pretty well. We had a great party for the speakers and a great time after the code camp.


    Things we need to do for next time.


    Make sure it’s not on a holiday or a football game day, and keep it far away from any big MS event (TechEd).


    Have it either in the spring or fall (we want to have the next one in Sept of ’07).


    If any of you that attended have anything you would like to say, please let me know; I would enjoy any and all feedback possible. We hope to make next year’s code camp even better.

    I want to personally thank all of our contributors, as well as all of our volunteers and the Code Camp Committee for all their hard work.  I also want to thank Microsoft and Joe Healy for all his help, you can find him very active at  It’s people that make such an event possible. I look forward to next year’s event.


    Well, time to take the little one to summer school, look forward to meeting any of you that might be coming to the July 6th meeting. We have a great speaker lined up (


  • Jay Karnik

    A very good friend and business partner has passed away.


    I’ve known him since Aug. of 2001; he was the funny old guy in the corner office. People didn’t seem to pay him much mind and I didn’t have direct contact with him at that time.


    Shortly after starting work with the company, 9/11 happened. We had clients in the building and it was the slow decline for the company after that. As we shrunk in size we became closer. We moved to a new office in 2003 and I shared an office with Jay.


    I had not gotten the pleasure to know the man for the first year and a half I worked for the company; he kind of stayed to himself. It wasn’t until our first business trip that we really bonded. Away from everyone and staying in NYC, there weren’t many distractions. There was nothing to do but walk the streets, find a good bar and/or restaurant. I really got to know him and we talked a great deal about many subjects from religion to politics.


    Since then we took many business trips and many code camp trips around Florida. Jay was also a big FSU booster and a golden Chief (means he gets great tickets). I had gotten my season tickets from him for the past two years. Last year we had some great tailgating fun. I would bring the Rum and Coke and he would bring his grill and sausages. We had a lot of fun last season and I was dearly looking forward to the next season.


    He became a good friend. He was there when I needed him. You could ask anything from him and he would do it if he could. We will all miss him dearly. He was a great story teller, I only wish I could get a chance to experience life the way he did.


    As a young man he traveled around the world. He was in the USSR for a bit of travel as one of his stories goes. I would hear his stories at 2am while drinking rum and coke. They would always take you away to another time another place, though it could have also been the rum.


    He had “his” sense of humor, but no matter how badly anyone hurt him, even a close friend, he would not say bad things about the person. He was very forgiving in some ways. He handled situations much better than I would have. In some of the situations I’ve seen him in, I would have decked the guy and landed him in the hospital (not Jay; the other guy).


    Maybe that was it, he held it in. He held it in and it cracked his heart. The doctors said he had a lot of damage. It makes me wonder if all those times he held it in caused the damage.


    There was a lot to Jay. He loved his family; although he never married or had kids, he was very dedicated to them. He was also the great uncle you wish you had when you were growing up. He spent a great deal of time and effort bringing his family to the United States, sometimes one at a time. He would take his nephews to NYC for their 21st birthdays and tried to make each a memorable occasion.


    Even with everything going on at work and all the stress he decided to still go to a wedding in his family up north. He had started to have second thoughts about the trip and I remember telling him what he told me once. Family is all you have in this world.


    He did everything he could to help people that he knew. He kept things going and gave guidance even through turbulent years of company financial woes and changes. He had vision and knew something special when he saw it. He was a very smart man and could just blow your mind away, especially in physics (he was physics major at FSU, many cycles back).


    He will be missed by everyone that ever had the privilege to know him and spend time with him.


    To Jay Karnik


    May we see him again, and we wish him a wonderful journey


    Now I begin a new journey myself and seek new opportunities.

  • Converting Access Tables to SQL Linked Tables

    I’ve gone to the ends of the earth looking for how to get an existing Access application working with its data via SQL Server (or any other, for that matter).

    Most of what I found talked about migrating the Access-format application to the Access Project format; this just involved creating a new project, pointing it at an existing/new connection (to SQL Server), and then importing everything you want from the Access-format application.


    Problem was that it did not import or convert queries, and queries are a real pain. The only thing worse than Access queries, in my opinion, is queries with parameters. Talk about a nightmare: 100 or so queries, some with parameters, many using VBA functions. This was no small task. Then there was the application: everywhere it used “OpenQuery”, or any other DAO-type operation, had to be replaced. We were looking at a lot of work.


    Upsizing to the rescue. I’m sure there are some articles out there, but I’ve not found one; most seem to have the mindset of migrating totally out of Access, which isn’t what I want. We simply wanted the tables in SQL Server and everything else left alone (why break what works?). Later we are going to build a new UI and new structure and keep the two updated via triggers, but that’s another article.


    First thing to do: at your Tables listing in Access, you need to make sure every table has a primary key (or Access will treat it as not updateable).


    Now go to Tools / Database Utilities / Upsizing Wizard.


    On the first screen, set it to use an existing database (I’ve had errors using “Create new database”, so I would advise you to pre-create your database), then click Next.


    On the next screen, if you don’t have a data source already, hit New; otherwise select the one you have and hit OK (we will not go through the DSN creation, it’s pretty routine).


    On the following screen, select all the tables you would like to move over, then click Next.


    For the table attributes, I checked Indexes, Validation rules, Defaults, and Table relationships (Use DRI). If you are only interested in structure, pick the “Only create the table structure” option. Then click Next.


    This next step is very important: in order to upsize and then replace the existing tables, you must select the “Link SQL Server tables to existing application” option.


    Now click Finish.


    That’s all there was to it. You will get a report of any tables you forgot to give primary keys, along with any other errors it might have hit.


    Now your Access application will work just as it always has. Later you can migrate it fully, but only if you have to.


    The main issue we have with this method is speed: since they are linked tables, you take a performance hit; how bad depends on your data and queries.

  • Persisting Custom Collections of a Custom Control

    Well, everyone, ready for CODE CAMP ORLANDO!!!!!!!!!!! YEA!!!!!!!!!!!


    The past few days I have not been having much fun. Not too long ago I dived back into ASP.NET development and, being dissatisfied with such weak offerings, I decided to take charge.


    What do I mean take charge? Well it’s simple; I personally think web applications have it all wrong.


    I think it’s very possible to build a data bound form and have state; it’s all in how you go about doing it.


    So, as my first piece in fixing this mindset, I decided to build a data source control. I wouldn’t have gone this route if it weren’t for the fact that in 2005 I was forced into a different way of doing web applications, without any choice. Maybe I’m wrong (I haven’t found any way), but there is no way to say “keep it in 2003 mode”, even when picking the general developer 2003 profile. I also don’t like the idea of non-UI elements all over my design surface, and I don’t like the whole “everything is a control” approach. I know why it was done this way: by default the design surface only interacts with controls (not components), and in order to do everything in markup (instead of code-behind) you need to work only with controls. So in that process they just took away everyone’s components, components people worked long and hard on and that worked nicely at design time. That’s right: no components. You can’t even add them to the toolbox; only controls can be added when you are working on a web project.


    Markup, markup, markup. Why is everyone so nuts about markup? Personally, if the markup doesn’t have a direct effect on the UI then I don’t want it there; it should be in the code-behind where it belongs. I can see UI-related bits being in markup; that I get.


    Another thing bothering me is that since my DAL/BLs are all component based, I can’t bind to them in 2005. I have to write a data source control (or use that limited ObjectDataSource control). Sounds simple: I will just make a data source control to consume my DAL/BLs, right? WRONG!! It was a nightmare. Granted, building one was a piece of cake, but getting it to work with the design-time surface was a complete nightmare. Let’s just say there was limited documentation, and in what I did find, the samples didn’t work.


    Yes, all I wanted was a simple data source control that consumes my DAL/BLs. I did that; I even added the ability for it to search (on click of the ellipsis in the property browser) and list all the classes from each assembly that match the IBL interface I built. All cool and dandy: it creates an instance when that property is populated, and it populates the “GetViewNames” method. Now you would think the rest would be simple: drag a datagridview onto the form, set the DataSource, hit the DataMember and boom, select the desired data table. WRONG!!!!!!!!! Nope. It seems that by default there are no base designers for the DataSourceControl class. You would think that, with GetView and GetViewNames, the grid would just have some attribute that calls the data source’s GetViewNames and we would be done, right? Nope: custom designer.


    And it doesn’t end there: I have to write a custom DataSourceView designer, a schema designer and a field designer, all so that the design-time environment will work. Let me say this: I personally think at this point that some of the ASP.NET 2.0 bits are a bit over-engineered. It’s like going from one extreme to another: all the enterprise guys demand business objects, blah blah blah, and 1.0–1.1 was more STD friendly; now it’s the other way around, don’t ask me why.


    My latest battle was something very simple, and since this is fresh in my head I will share it. Take a look at this code.


    <System.Web.UI.ToolboxData("<{0}:TestControl runat=""server""></{0}:TestControl>"), Web.UI.ParseChildren(True), Web.UI.PersistChildren(False)> _
    Public Class TestControl
        Inherits System.Web.UI.Control

        Private _Col As New ComponentModel.BindingList(Of TestThis)

        <ComponentModel.DesignerSerializationVisibility(ComponentModel.DesignerSerializationVisibility.Visible), Web.UI.PersistenceMode(Web.UI.PersistenceMode.InnerProperty)> _
        Public ReadOnly Property MyCollectionTest() As ComponentModel.BindingList(Of TestThis)
            Get
                Return Me._Col
            End Get
            'Set(ByVal value As ComponentModel.BindingList(Of TestThis))
            '    Me._Col = value
            'End Set
        End Property
    End Class


    Public Class TestThis

        Private Val As String

        Public Property MyString() As String
            Get
                Return Val
            End Get
            Set(ByVal value As String)
                Me.Val = value
            End Set
        End Property
    End Class


    Looks rather simple, doesn’t it? Well, believe it or not, in WinForms it takes just one attribute to have a collection persist to code: ComponentModel.DesignerSerializationVisibility(ComponentModel.DesignerSerializationVisibility.Content).


    Yep, one little attribute and it all works; the funny thing is that this line worked in 1.0 and 1.1 for ASP.NET as well. Here comes 2.0 and the markup deal, and now it doesn’t work, so I was reduced to finding out why. I looked everywhere, searching for all the wrong things. I finally decided to search for custom controls and samples, and looked at several custom controls people had written articles on. I finally found one with a collection; it didn’t work. I found another; it didn’t work (in 2.0, mind you, for both). I finally put pieces together from each, and boom, the code above is the result.


    Note this part:

      'Set(ByVal value As ComponentModel.BindingList(Of TestThis))

      '    Me._Col = value

      'End Set


    It seems I spent quite a while trying to get it to work (with the attributes we will discuss in just a minute). I could get it to persist to markup, but when I opened the form in Design view, the control couldn't read the collection back; it would just error out. Finally I said, what the hell, it looks like maybe it's trying to create a new MyCollectionTest object, so I commented out the Set accessor, and guess what? It worked just fine and everything was merry with the world. Pretty messed up, huh? I understand why, but at the same time I don't think it should be like that. Anyway, let's talk about the attributes that got the items to persist and gave me the break I had been searching for.


    What was the breakthrough, you might ask? Well, it's a couple of things, really. Note these attributes on the property.


    <ComponentModel.DesignerSerializationVisibility(ComponentModel.DesignerSerializationVisibility.Visible), Web.UI.PersistenceMode(Web.UI.PersistenceMode.InnerProperty)>


    The DesignerSerializationVisibility attribute tells the designer host that this is a property to serialize (based on the setting). The PersistenceMode attribute tells the designer host how you want to persist the item; in this case I want it persisted as an inner property, meaning the collection is written as nested elements inside the control's tag.


    Now a couple on the custom control class:


    <System.Web.UI.ToolboxData("<{0}:TestControl runat=""server""></{0}:TestControl>"), Web.UI.ParseChildren(True), Web.UI.PersistChildren(False)> _


    Now, the first one (ToolboxData) is pretty common; no need to explain that one. The other two aren't, so let's examine them. ParseChildren(True) says to parse the nested content as properties of the control rather than as child controls (the nested elements become the collection's items; note that you can also point it at a single collection property). PersistChildren(False) says that none of the control's nested controls are persisted, so the inner content is persisted as properties instead. Set it to True and nothing works right (for the collection, that is).


    That’s all there was to it. Pretty simple, huh? I can’t believe it beat me like it did; I just couldn’t associate the need for all these other attributes. All to get one little ole collection to do this:


            <cc1:TestControl ID="TestControl1" runat="server">

                    <cc1:TestThis />

                    <cc1:TestThis />

                    <cc1:TestThis />

                    <cc1:TestThis />

            </cc1:TestControl>

    Pretty darn simple when you look at it. Such a pain and so much guess work.


    Anyway, I plan to write a nice big fat article on this and present it as part of my new Strongly Typed Data Access Layer for 2.0 (supporting both WinForms and WebForms) presentation.


    Yes, it’s going to be much different than my previous talk, though that one was based on version 1.1 of the framework. So keep checking in; I hope to have it ready soon.


