Apple’s commercials versus Verizon’s commercials

Posted by Bryon on June 19th, 2011

Apple has released dozens of iPhone and iPad commercials.  Apple’s commercials are great – they highlight the things you can do with the technology and the many ways you can use, benefit from, and enjoy it.  Verizon’s commercials, on the other hand, highlight the technology specifications.  They focus on the chipset, the memory, the radio technologies…

Apple/iPad: “When technology gets out of the way everything becomes more delightful.”

Verizon/Android: “Your wife will love the dual core Tegra 2 chipset.”

Apple/iPad: “If you ask a parent, they might call it intuitive.”

Verizon/Android: “4G LTE upgradeable.”

Really, Verizon?  Have you not learned?  Who the hell knows what a dual-core Tegra 2 chipset is?  I do, but only because I live and breathe technology – most people don’t.  It’s no way to sell.  Apple has sold tens of millions of iPads – and the competitors will flounder, even if they have a better product, because they don’t know how to market it.

TFS Work Items – How long has a defect been in its current status?

Posted by Bryon on December 14th, 2010

An important part of managing the software development lifecycle is tracking the status of your defects, who they are assigned to, and how long they have been assigned to that individual.  Ideally, you want the number-of-days metric to be small.  The query below will provide you with this data, so you can easily identify who is slow to resolve their defects and who is doing a great job turning around their assigned work.  Please note that this query is based on the TFSWarehouse database – not the cube.  If you have an easy way of doing the same thing using the cube, let me know; this is the most efficient way I have found.

select max(__lastupdatedtime) maxdate, system_id as sysID
    into #temptable
    from [work item]   
    group by system_id

select distinct(system_ID) as WorkItemID, System_Title as Title,
    Person.Person as AssignedToName, currw.System_CreatedDate as CreateDate, [Previous State] as PreviousStatus,
    System_State as CurrentStatus, System_Reason as Reason, Priority, 
    EnvironmentFoundIn as Environment, Blocked,
    w.__lastupdatedtime as DateUpdated, DateDiff(d,w.__lastupdatedtime, GetDate()) as DaysInStatus,
    DateDiff(d,currw.system_createddate, GetDate()) as DaysOpen
from [work item] w, #temptable t, [current work item] currw, person
where t.maxdate = w.__lastupdatedtime
And t.sysID = w.system_id
And currw.__TrackingId = w.system_id
And person.__id = currw.[assigned to]
And System_State <> 'Closed'
And currw.[team project] = 1
Order By DaysInStatus DESC

drop table #temptable

Ideally, you can wrap this into a stored proc and use SSRS to present this data to end users and management.
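
If SSRS isn’t an option, the same data can be consumed from a small console utility.  Below is a minimal sketch – the connection string and the stored proc name “GetDefectAging” are assumptions, not part of the query above:

using System;
using System.Data.SqlClient;

class DefectAgingReport
{
    static void Main()
    {
        // Connection string is an assumption - point it at your TFSWarehouse database
        string connStr = "Server=tfssql;Database=TFSWarehouse;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Assumes the query above has been wrapped in a stored proc
            // named GetDefectAging (a hypothetical name)
            SqlCommand cmd = new SqlCommand("GetDefectAging", conn);
            cmd.CommandType = System.Data.CommandType.StoredProcedure;

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Column names match the aliases in the query above
                    Console.WriteLine("{0} | {1} | {2} days in status",
                        reader["WorkItemID"], reader["AssignedToName"], reader["DaysInStatus"]);
                }
            }
        }
    }
}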

Mass updating work items in TFS

Posted by Bryon on December 13th, 2010

Need a quick way to update some work items matching a specific query?  Use Power Tools to select the work items and then pipe the results into a second command to update them.  Here is the syntax:

tfpt query /format:id "TeamProject\My Queries\Query1" /server:http://tfsserver:8080 |
    tfpt workitem /update @ /fields:"Assigned To=bbrewer" /server:http://tfsserver:8080

A pretty quick and easy way.  It will display each work item ID it updates in the command window.  Even nicer, it gives you a summary at the end listing any work item IDs it was unable to update (which could be because of validation rules on the work items).
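
If you need more control – say, only updating items that meet extra conditions – the same mass update can be scripted against the TFS object model.  A minimal sketch, where the server URL, the query, and the assignee are placeholder assumptions:

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class MassUpdate
{
    static void Main()
    {
        // Server URL is an assumption - replace with your own
        TeamFoundationServer tfs = new TeamFoundationServer("http://tfsserver:8080");
        tfs.Authenticate();

        WorkItemStore store = tfs.GetService(typeof(WorkItemStore)) as WorkItemStore;

        // Pull back the matching work items, then update each one
        WorkItemCollection results = store.Query(
            "SELECT [System.Id] FROM WorkItems " +
            "WHERE [System.TeamProject] = 'TeamProject' AND [System.State] = 'Active'");

        foreach (WorkItem wi in results)
        {
            wi.Open();
            wi.Fields["System.AssignedTo"].Value = "bbrewer";
            wi.Save();
            Console.WriteLine("Updated work item " + wi.Id);
        }
    }
}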

.NET 4.0 Threading – High-speed file crawler

Posted by Bryon on August 18th, 2010

Multi-threaded applications have traditionally been complex to build.  However, the .NET 4.0 framework has greatly simplified the process of creating threaded applications.  Given that most machines today have multiple processor cores, it is especially important to start writing applications that take full advantage of that processing power.  The example below uses the “System.Threading.Tasks.Parallel” class to report on file usage across a file system.  It was written for a client who wanted to report on several pieces of information related to file shares allocated to company departments:

  1. Number of files and total bytes used per file owner (who is using the space)
  2. Number of files and total bytes per file extension (useful to determine how many audio files, video files, Office documents, etc. you have)
  3. Age of files grouped into several buckets (0 to 6 months, 6 to 12 months, 12 to 24 months, and over 24 months)

The first method, “CrawlDirectories”, accepts the root directory you want to start crawling from, the name of any directory you want to exclude, and whether you want to retrieve file owner information for each file.

private void CrawlDirectories(string RootDirectory, string excludeDirectory, bool getFileOwners)
{
    // get a directory object
    try
    {
        DirectoryInfo di = new DirectoryInfo(RootDirectory);

        // loop through directories and look at files
        try
        {
            DirectoryInfo[] myDi = di.GetDirectories();

            Parallel.ForEach(myDi, folder =>
            {
                // skip any directory we were asked to exclude
                if (folder.Name.Equals(excludeDirectory, StringComparison.OrdinalIgnoreCase))
                    return;

                //look at files
                AddFiles(folder.GetFiles(), getFileOwners);

                //see if subdirectories to crawl
                CrawlDirectories(folder.FullName, excludeDirectory, getFileOwners);
            });
        }
        catch (Exception ex)
        {
            myLog("Error getting subdirectories: " + di.FullName + "; " + ex.Message, true);
        }
    }
    catch (Exception ex)
    {
        myLog("Error getting directory info: " + RootDirectory + "; error: " + ex.Message, true);
    }
}

Notice the loop using the new “Parallel” class above.  This defines a parallel loop.  The application will spin up multiple threads to crawl through the directories.  Within this loop, we call the “AddFiles” method, which loops through the files in the current directory, and then “CrawlDirectories” calls itself to recursively crawl any subdirectories within the current directory.  Monitoring my machine resources, I can see on average 70 threads spin up to crawl the directories.  All of this is handled automatically by the framework!
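
One note: the framework decides on its own how many threads to use.  If you’d rather cap the concurrency – say, to limit disk thrashing on a single spindle – Parallel.ForEach also accepts a ParallelOptions argument.  A quick sketch, where the limit of 8 and the path are arbitrary assumptions:

using System;
using System.IO;
using System.Threading.Tasks;

class ThrottledCrawl
{
    static void Main()
    {
        // Cap the crawl at 8 concurrent tasks instead of letting the
        // framework scale up on its own (8 is an arbitrary choice)
        ParallelOptions options = new ParallelOptions { MaxDegreeOfParallelism = 8 };

        DirectoryInfo[] folders = new DirectoryInfo(@"C:\Shares").GetDirectories();
        Parallel.ForEach(folders, options, folder =>
        {
            Console.WriteLine("Crawling " + folder.FullName);
            // the AddFiles / recursive crawl logic from above would go here
        });
    }
}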

The next method, “AddFiles”, uses a parallel loop to report on the files within each directory.  When you are using arrays or other types of collections in a multi-threaded application, you must use thread-safe collections!  I have chosen “System.Collections.Concurrent.ConcurrentDictionary” to store the file data I’m collecting.  This is basically a key/value pair collection that is thread-safe and can be accessed by multiple threads concurrently.

private void AddFiles(FileInfo[] filesInDir, bool getFileOwners)
{
    Parallel.ForEach(filesInDir, fi =>
    {
        try
        {
            //calculate # of files and bytes per file owner
            if (getFileOwners)
            {
                // locals are declared inside the loop body so each thread gets its own copy
                IdentityReference NTAccountName = fi.GetAccessControl(AccessControlSections.Owner).GetOwner(typeof(System.Security.Principal.NTAccount));
                string ownerName = NTAccountName.Value.ToUpper();
                ownerBytesDC.AddOrUpdate(ownerName, fi.Length, (key, old) => old + fi.Length);
                ownerCountDC.AddOrUpdate(ownerName, 1, (key, old) => old + 1);
            }

            //calculate file extension # of files and bytes
            string fileExtension = fi.Extension.Replace(".", "").Replace(",", "").ToUpper();
            extensionBytesDC.AddOrUpdate(fileExtension, fi.Length, (key, old) => old + fi.Length);
            extensionCountDC.AddOrUpdate(fileExtension, 1, (key, old) => old + 1);

            //calculate ages of files (each else-if already implies the upper bound)
            if (fi.LastAccessTime >= DateTime.Now.AddMonths(-6))
            {
                System.Threading.Interlocked.Add(ref TotalByte0to6mo, fi.Length);
                System.Threading.Interlocked.Add(ref NumberOfFiles0to6mo, 1);
            }
            else if (fi.LastAccessTime >= DateTime.Now.AddMonths(-12))
            {
                System.Threading.Interlocked.Add(ref TotalByte6to12mo, fi.Length);
                System.Threading.Interlocked.Add(ref NumberOfFiles6to12mo, 1);
            }
            else if (fi.LastAccessTime >= DateTime.Now.AddMonths(-24))
            {
                System.Threading.Interlocked.Add(ref TotalByte12to24mo, fi.Length);
                System.Threading.Interlocked.Add(ref NumberOfFiles12to24mo, 1);
            }
            else
            {
                System.Threading.Interlocked.Add(ref TotalByteOver24mo, fi.Length);
                System.Threading.Interlocked.Add(ref NumberOfFilesOver24mo, 1);
            }
        }
        catch (Exception ex)
        {
            myLog("Cannot report on file: " + fi.FullName + "; error: " + ex.Message, true);
        }
    });
}

We end up with several key/value pair collections:

  1. ownerBytesDC and ownerCountDC.  These collections both have a key of the user name, with values of the total bytes for that user and the count of files.
  2. extensionBytesDC and extensionCountDC.  Key is the file extension, with values of total bytes and file count.
  3. The remaining data is stored in Int64 variables for the age of files in each age bucket.

The .NET 4.0 framework provides a great way to update a collection in a thread-safe manner.  The collection has an “AddOrUpdate()” method that will add the key if it doesn’t exist in the collection, or update it if it’s already there.
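
As a tiny standalone illustration of the semantics (separate from the crawler itself): if the key doesn’t exist, the value you pass in is added; if it does, the delegate computes the new value from the old one.

using System;
using System.Collections.Concurrent;

class AddOrUpdateDemo
{
    static void Main()
    {
        ConcurrentDictionary<string, long> bytesByExtension = new ConcurrentDictionary<string, long>();

        // Key doesn't exist yet, so 100 is simply added
        bytesByExtension.AddOrUpdate("TXT", 100, (key, old) => old + 100);

        // Key exists now, so the delegate runs: 100 + 250 = 350
        bytesByExtension.AddOrUpdate("TXT", 250, (key, old) => old + 250);

        Console.WriteLine(bytesByExtension["TXT"]); // prints 350
    }
}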

You may be wondering why the method has the “getFileOwners” boolean parameter.  Getting the owner of a file requires a call to the “GetAccessControl” method – part of the Windows file security APIs.  This operation is expensive!  It slows the crawl down several-fold, so if you don’t require this information, you can turn it off.

This application can crawl through about two million files and three terabytes of data per hour (tested on a two-proc VMware machine).  More robust hardware could speed this up.

Easy multi-threaded applications – finally!

Solution: VMware guest shutdowns very slow

Posted by Bryon on August 2nd, 2010

In certain scenarios, you may find that shutting down a VM is very slow for no apparent reason.  I found this to be true under the configuration I’m running:

  1. Windows 7 64-Bit
  2. Utilizing an external USB drive to host the guest virtual disk
  3. 8GB RAM
  4. Using a Dell laptop (tried on a D630 and Precision M4500)

To fix this issue, add the following settings to the config.ini file in both of these locations:

C:\ProgramData\VMware\VMware Workstation\config.ini AND C:\ProgramData\VMware\VMware Player\config.ini

prefvmx.minVmMemPct = "100"
mainMem.useNamedFile = "FALSE"
mainMem.partialLazySave = "FALSE"
mainMem.partialLazyRestore = "FALSE"

You should notice immediate improvement in shutdown speed.

Custom Sorting for Work Items in Team Foundation Server 2008

Posted by Bryon on September 23rd, 2009

TFS provides the Work Item Query Language (WIQL) to query work items.  WIQL is very similar to standard SQL — you have a Select statement to pull back columns of data, a Where clause to filter data, and an Order By clause.  However, WIQL is much more limited than standard SQL.  In particular, advanced sorting logic is not supported, so you may find yourself needing to implement some type of custom solution to sort your work items.  A recent client had a pretty advanced sorting system in place in their old Mercury TestDirector system (they migrated over to TFS and retired TestDirector).  You might need custom sorting if, for example, you need special logic to determine the priority of defects/bugs, since developers work the defect queue in the order of the work items.  Here are your options:

  1. Use WIQL.  As stated above, this option is limited to basic sorting.  You can sort by a single field or multiple fields, but conditional logic and advanced grouping are not supported.
  2. Develop a web service triggered by the Work Item Changed event.  By adding a field to the work item definition (perhaps an integer field labeled "SortOrder"), you can evaluate the work item to determine what sort order it should be assigned.  In our case, we identified five sort levels, so the "SortOrder" field was assigned a number between 1 and 5.  Work item queries were written to sort on this field first, then by other fields as a secondary sort.

    The web service runs after you save the work item: the event fires, the service evaluates the work item, assigns a value to the "SortOrder" field, and then saves the work item.

    Sample code (this uses the TFSEvents framework available on CodePlex):

    public void Notify(string eventXml, string tfsIdentityXml)
    {
        WorkItemChangedEvent workItemChangedEvent = this.CreateInstance<WorkItemChangedEvent>(eventXml);
        TFSIdentity tfsIdentity = this.CreateInstance<TFSIdentity>(tfsIdentityXml);

        int WorkItemID = workItemChangedEvent.CoreFields.IntegerFields[0].NewValue;
        Microsoft.TeamFoundation.Client.TeamFoundationServer TFS = new TeamFoundationServer(tfsIdentity.Url);
        TFS.Authenticate();

        WorkItemStore WIS = TFS.GetService(typeof(WorkItemStore)) as WorkItemStore;
        WorkItem updatedWI = WIS.GetWorkItem(WorkItemID);
        string priority = "";
        string environment = "";

        if (updatedWI.Fields.Contains("CustomFields.SortOrder"))
        {
            priority = updatedWI.Fields["CustomFields.Priority"].Value.ToString();
            environment = updatedWI.Fields["CustomFields.FoundInEnvironment"].Value.ToString();

            if (priority == "1-Now")
            {
                updatedWI.Fields["CustomFields.SortOrder"].Value = 1;
            }
            else if (priority == "2-ASAP")
            {
                updatedWI.Fields["CustomFields.SortOrder"].Value = 2;
            }
            else if (updatedWI.Fields["CustomFields.CRID"].Value.ToString() != "" ||
                (updatedWI.Fields["System.IterationID"].Value.ToString() != "154" &&
                updatedWI.Fields["System.IterationID"].Value.ToString() != "314"))
            {
                updatedWI.Fields["CustomFields.SortOrder"].Value = 3;
            }
            else if (environment == "2-PRE-PROD" || environment == "3-Test" || environment == "4-DEVL")
            {
                updatedWI.Fields["CustomFields.SortOrder"].Value = 4;
            }
            else if (environment == "1-PROD")
            {
                updatedWI.Fields["CustomFields.SortOrder"].Value = 5;
            }

            if (updatedWI.IsDirty)
            {
                updatedWI.Save();
            }
        }
    }

    A big disadvantage to this method — the work item gets updated by the web service, so if a user has the work item open on their screen, makes a change, and tries to save, the save will fail and the user is notified that the work item has been updated by another user.  This scenario is very likely, since the user will have the work item on their screen during the initial save and may want to add or update information afterward.  They are forced to close and reopen the work item after each save.  For most organizations, this would be frustrating and would probably disqualify this solution.

  3. Develop a client-side custom control.  This option is similar to the web service, but the logic lives on the client.  Using a custom field such as "SortOrder", you can assign a sort value based on the work item information.  The logic runs before the work item is saved, so only one update/save operation is required.  The advantage of this method is that you avoid the update conflict you get with the web service.  The disadvantage is that the control must be deployed to all your developer/client machines, or else the sort logic won’t run and no sort order will be assigned.

Based on all these options, if you require advanced sorting logic, the best option is to develop a client-side custom control.
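
However the "SortOrder" value gets assigned, work item queries can then sort on it directly.  Here is a minimal sketch of running such a sorted query through the WorkItemStore API – the server URL and project name are placeholder assumptions, while the field names match the sample above:

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class SortedDefectQueue
{
    static void Main()
    {
        // Server URL is an assumption - replace with your own
        TeamFoundationServer tfs = new TeamFoundationServer("http://tfsserver:8080");
        tfs.Authenticate();

        WorkItemStore store = tfs.GetService(typeof(WorkItemStore)) as WorkItemStore;

        // SortOrder drives the queue order; priority breaks ties
        WorkItemCollection queue = store.Query(
            "SELECT [System.Id], [System.Title] FROM WorkItems " +
            "WHERE [System.TeamProject] = 'MyProject' AND [System.State] <> 'Closed' " +
            "ORDER BY [CustomFields.SortOrder], [CustomFields.Priority]");

        foreach (WorkItem wi in queue)
        {
            Console.WriteLine(wi.Id + ": " + wi.Title);
        }
    }
}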

SharePoint Designer 2007

Posted by Bryon on April 6th, 2009

SharePoint Designer 2007 is now free!  Check it out – http://blogs.msdn.com/sharepoint/archive/2009/04/02/sharepoint-designer-available-as-a-free-download.aspx

Dallas VSTS User Group

Posted by Bryon on November 21st, 2008

So I went to the Visual Studio Team System user group meeting a couple of nights ago here in Dallas.  It’s really great to have a group of smart people here in Dallas who have a lot of expertise and passion for VSTS / Team Foundation Server.  If you are interested in learning more about VSTS, visit http://www.dallasvsts.com/ for meeting times and locations.  The group meets every month.

Ed Blankenship (VSTS Microsoft MVP) did a presentation on the new October release of the Team Foundation Server 2008 Power Tools and the new features coming in VSTS 2010.  The new testing tools coming with VSTS 2010 are really amazing — testing is a huge focus for this release.  VSTS testing tools have always been lacking in the UI functional testing area.  The new release includes a tool codenamed Camano that is specifically designed to let testers easily record UI tests, report bugs, and save tests for later regression runs.

Using a Friendly URL for Team Foundation Server

Posted by Bryon on November 21st, 2008

It is a pretty common need to change the URL of your Team Foundation Server 2008 instance.  For example, you may want to create an alias or other easy-to-remember URL for your team to access TFS.  TFS 2008 has made this task much easier.  To accomplish this:

  1. On your application tier server, run the following command:
    TFSAdminUtil ActivateAT tfs.mycompany.com
  2. On the application tier server, modify the registry (run regedit.exe):
    Path: HKLM\Software\Microsoft\VisualStudio\9.0\TeamFoundation\ReportServer\80\Sites
    Registry keys: Change the BaseReportURL and ReportService keys to the new URL.
  3. Finally, you need to set up SharePoint to use the new URL.  In SharePoint Central Administration, under Operations, click “Alternate Access Mappings”.
  4. Add the new URL.  You can add an internal URL for the Default zone to apply across all zones.  You can also edit the public URL to respond to the new URL.

Based on my experience, this will update the URL in all places it needs to be updated.

ITIL Certified!

Posted by Bryon on October 4th, 2008

I passed my ITIL certification test today!  So I am officially ITIL certified.  I took the test at Prometric in Dallas — it’s the Foundations exam.  It’s always a little nerve-racking taking a certification test, but I’m glad I passed.

ITIL is a framework for IT service delivery.  I really learned a lot while studying the material.  From an IT manager’s perspective, it is invaluable for understanding all the concepts of running an IT department, creating a customer-focused team, and delivering top-quality IT services to your customers.  It covers topics such as how to design, set up, and run a service desk; incident and problem management; configuration management; release management; and creating service level agreements (SLAs) with your customers.  It also covers some infrastructure-related topics such as capacity management, availability management, and business continuity planning.

In my years in the IT industry, I can certainly see how this framework would benefit many IT organizations.  You really need a framework like this to help you formalize the processes around all aspects of delivering IT services.  So many organizations stumble on one or more of the topics mentioned above, and all of them are important to delivering IT services successfully.  I also like how customer-oriented the framework is — it helps IT align its goals with the business’s goals.

To prepare for certification, I used the following resources:

  1. InteQ ITIL training — a web-based study program containing 11 or 12 modules that walk you through all the ITIL processes.  It’s good material, with audio to guide you through each topic.  EMC paid for this course.
  2. ITIL Prime — they provide practice exams.  These are great for preparing for the certification exam and seeing where you stand.  I think the practice tests were a little harder than the actual exam, so if you can pass these, you’ll probably do fine.
  3. “Foundations of IT Service Management” — this book, available on Amazon, provided a large part of the study material.  The book is good and covers everything you need to know.
  4. Lastly, there is a wall chart that maps out all the ITIL processes.  This was very helpful as a quick reference!  I can see myself using it in the future — I highly recommend it.

Copyright © 2010 Bryon Brewer (Owner, Project Consults, Inc.)