
Microsoft AJAX, Business Objects at Client Side–Part 1

When I started working with ASP.NET, it was all about server-side coding and database connectivity; JavaScript was little more than a validation tool. That was my mind-set. But my recent days at my new company changed it. Now I am exploring JavaScript in great detail, and I am delighted to have colleagues who helped me make that shift. During this time I experimented with many things, and one of them is the effective use of Microsoft AJAX.

Most ASP.NET programmers are in the habit of using UpdatePanels and ScriptManagers to work with Microsoft AJAX, and they stop there. Few people go beyond that, yet that is where the real beauty begins. I just wanted to share it with you.

My goal here is to access the server-side business objects from JavaScript with the same signatures. The advantage is that I don't need to worry about data types or JSON strings; to me it is just like calling another server-side function. The Microsoft AJAX framework offers several ways to achieve this.

In my case I used WCF services to expose my business logic; a plain web service would work as well. So what are the tools I require to build this?

  1. AJAX Enabled WCF Service.
  2. Script Manager (Microsoft AJAX)

Let us see these points in detail,

1. AJAX Enabled WCF Service:

In my project, the business logic lives in its own business layer. Since it is already there, I don't want to change it. So I created one WCF service in which I wrote proxy classes that exactly match my business classes.

Here I am using the sample SQL database "AdventureWorks", specifically the Employee, Departments, EmployeeDepartmentHistory and EmployeeAddress tables. My dbml is as follows:

[Image: the .dbml designer view of the selected AdventureWorks tables]

Now I need to create a WCF service to access these tables.

[Image: adding the AJAX-enabled WCF service in Visual Studio]

In my case, the service name is Employee.svc. The following is the default code that Visual Studio generates:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Activation;
using System.ServiceModel.Web;
using System.Text;

namespace ClientSideBusinessObjects.WCFService
{
	[ServiceContract(Namespace = "")]
	[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
	public class Employee
	{
		// To use HTTP GET, add [WebGet] attribute. (Default ResponseFormat is WebMessageFormat.Json)
		// To create an operation that returns XML,
		// add [WebGet(ResponseFormat=WebMessageFormat.Xml)],
		// and include the following line in the operation body:
		// WebOperationContext.Current.OutgoingResponse.ContentType = "text/xml";
		[OperationContract]
		public void DoWork()
		{
			// Add your operation implementation here
			return;
		}

		// Add more operations here and mark them with [OperationContract]
	}
}

Here the "Namespace" on the ServiceContract (or DataContract) tells us under which name these classes are grouped on the client; in my case it is "BusinessObjects". Right after that we see AspNetCompatibilityRequirements; this is required and makes the service available on the client side. Every method that needs to be available on the client side must be attributed with [OperationContract].

2. ScriptManager (Microsoft AJAX):

The ScriptManager is what brings our web service to JavaScript. Once we place the ScriptManager onto the page, it is time to tell it about our web service. For this, the ScriptManager comes with a property called "Services", under which we can register our web service using a ServiceReference. The following is the typical code to achieve this:

<asp:ScriptManager ID="ScriptManager1" runat="server">
	<Services>
		<asp:ServiceReference Path="~/WCFService/Employee.svc" />
	</Services>
</asp:ScriptManager>

Now this setup brings the web service to the client side. Before going further, let us test what we have done. For the test, I wrote one method in my service, as follows:

public string WriteData()
{
	return DateTime.Now.ToString();
}

Now let us test the setup. I took a simple ASPX page named Test.aspx and placed my ScriptManager on it, along with an ASP.NET Label and Button. Since my application already references jQuery, I wrote the following code in my script block:

$(document).ready(function () {
	BusinessObjects.Employee.WriteData(function (result, e) {
		$get("<%=lblDate.ClientID %>").innerHTML = result;
	}, function (result, e) { alert(result.get_message()); });
});

In this code, we can observe how we are calling server-side methods. The general syntax is as follows:

<<NameSpace>>.<<ClassName>>.<<MethodName>>(<<Parameters>>, OnSuccess(result, e), OnFailure(result, e));

Here the result in OnSuccess holds the data returned from the web service method, deserialized from JSON. If an exception occurs while executing the method, it is captured in the "result" of OnFailure; to get the actual exception details, use the member method "get_message()".
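To make this callback convention concrete, here is a plain-JavaScript sketch that imitates how a generated proxy invokes OnSuccess and OnFailure. It has no MS AJAX dependency; `fakeWriteData` and `ServiceError` are my own illustrative names, not part of the framework.

```javascript
// A stand-in for the error object MS AJAX hands to the failure callback.
// Real framework errors expose the message via get_message().
function ServiceError(message) {
    this._message = message;
}
ServiceError.prototype.get_message = function () {
    return this._message;
};

// fakeWriteData mimics the shape of BusinessObjects.Employee.WriteData:
// the real proxy issues an async HTTP request; this sketch calls the
// callbacks synchronously just to show the (onSuccess, onFailure) contract.
function fakeWriteData(onSuccess, onFailure) {
    try {
        var result = new Date().toString(); // what the service would return
        onSuccess(result, null);
    } catch (err) {
        onFailure(new ServiceError(String(err)), null);
    }
}

fakeWriteData(
    function (result, e) { console.log("Success: " + result); },
    function (error, e) { console.log("Failed: " + error.get_message()); }
);
```

The real proxy adds the network round-trip and JSON deserialization, but the two-callback signature your page code sees is the same.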

This script executes once the document has finished loading and is ready for user input. After this, I wired the following JavaScript to the button's OnClientClick event:

function Update() {
	BusinessObjects.Employee.WriteData(function (result, e) {
		$get("<%=lblDate.ClientID %>").innerHTML = result;
	}, function (result, e) { alert(result.get_message()); });
	return false;
}

Since it returns "false" at the end of execution, the click does not trigger a postback. Now run the code and test; the following is the result.

[Image: Test.aspx showing the current time in the label]

Click the Update button to see the updated time. My complete aspx page looks like this:

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Test.aspx.cs" Inherits="ClientSideBusinessObjects.Test" %>

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
    <script src="Scripts/jquery-1.4.1.min.js" type="text/javascript"></script>
</head>
<body>
    <form id="form1" runat="server">
    <asp:ScriptManager ID="ScriptManager1" runat="server">
    <Services>
        <asp:ServiceReference Path="~/WCFService/Employee.svc" />
    </Services>
    </asp:ScriptManager>
    <div>CurrentTime:&nbsp;&nbsp;
        <asp:Label ID="lblDate" runat="server" Text=""></asp:Label><br />
        <asp:Button ID="Button1" runat="server" Text="Update" OnClientClick="javascript:return Update()" />
    </div>
    </form>
    <script type="text/javascript">
        $(document).ready(function () {
            BusinessObjects.Employee.WriteData(function (result, e) {
                $get("<%=lblDate.ClientID %>").innerHTML = result;
            }, function (result, e) { alert(result.get_message()); });
        });

        function Update() {
            BusinessObjects.Employee.WriteData(function (result, e) {
                $get("<%=lblDate.ClientID %>").innerHTML = result;
            }, function (result, e) { alert(result.get_message()); });
            return false;
        }
    </script>
</body>
</html>

In part 2, we will discuss business objects and how to access them on the client side.

PowerShell, Working with MS Access Database

This is my second post on PowerShell, continuing my earlier one. The more I work with PS, the more interested I get; for an administrator or a geeky programmer the possibilities are endless. Today my aim is to connect to MS Access, read a table and create another table with only the unique records. Initially I thought to use SQL in MS Access to remove the duplicates by grouping them and taking the top 1. The issue is, I can get unique records only for the "Doc Number" and "Doc Revision" columns of the table (check my previous post); as soon as I include all the other columns I can no longer get unique records. That left me with one option: write C# code to do the job. But I wanted to experiment with PowerShell instead.

Once you install PowerShell, you get two applications:

  1. PowerShell
  2. PowerShell ISE (Integrated Scripting Environment)

The ISE looks like the image below, and this is where you can write shell script for your needs. Of course, you can use any editor to write the script, but the ISE gives you syntax highlighting, IntelliSense, cmdlet help and also a shell to test your script.

[Image: the PowerShell ISE window]

As usual, I divided the whole problem into small milestones to achieve progressively:

  1. Connect to the MS Access MDB file through OleDB
  2. Create a dataset with two tables: one with only the schema (DT1) and one with the data (DT0)
  3. Go through each row of DT0, check whether DT1 already has the record, and insert it if not
  4. Export DT1 to a CSV file so I can use it further

With these milestones, I pulled up my ISE and started scripting for the first one.

Connecting to MDB file:

To achieve this I need three variables: one for the path to the MDB file, one for the SQL command, and one for the connection string. In PowerShell a variable is denoted with "$"; check the following code:

$dataSource = "C:\Users\pradeepy\Documents\uote_docs.mdb"
$selectQuery = "SELECT * FROM FULL_DOCS_INFO"
$dsn = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=$dataSource"

Here, "$dataSource", "$selectQuery" and "$dsn" are the variables I am going to use. With the variables ready, I now need to connect to the database. The following code does the job:

Write-Host "Creating connection to database" -ForegroundColor Cyan
$dbCon = New-Object System.Data.OleDb.OleDbConnection $dsn
$sqlCmd = New-Object System.Data.OleDb.OleDbCommand $selectQuery, $dbCon
$dbCon.Open()

In this code, "Write-Host" is a cmdlet we can use to print something onto the screen. We can also set the foreground and background colours of the text; in this case, I coloured it "Cyan". Now I should be able to run the code... and I ran it, only to be hit back by an exception:

[Image: the exception, complaining about the OleDB provider]

It seems I cannot access the OleDB provider for MS Access. I immediately checked my ODBC drivers by firing the command Get-OdbcDriver. It lists all the installed drivers, and my Access drivers were there. That means it is not a driver issue but something else. A quick Bing search revealed that this is a typical issue if you run the ISE in 64-bit mode. OK, let's close it, open the 32-bit version of the ISE and run the script. Now it works fine.

Note:

In some cases, the system might throw an exception saying it "cannot run the script due to script execution policy". In this case you can bypass the policy with the following cmdlet:

Set-ExecutionPolicy -ExecutionPolicy Bypass

This bypasses the policy so that you can run the script. To set it back, fire the above command again after replacing "Bypass" with "Restricted".

Creating the DataSet:

Now for the second milestone. We need two data tables to handle the data: one for the data from the MDB file and one for the unique records. The following code does that:

Write-Host "Creating datasets" -ForegroundColor Cyan
$dataSet = New-Object System.Data.DataSet
$sourceDataTable = New-Object System.Data.DataTable "SOURCE_UOTE_DOCS"
$finalDataTable = New-Object System.Data.DataTable "FINAL_UOTE_DOCS"
$dataSet.Tables.Add($sourceDataTable)
$dataSet.Tables.Add($finalDataTable)

Once the data tables are ready, we need to pull the data into DT0. The code below does that:

Write-Host "Fill schema of the table into a data table" -ForegroundColor Cyan
$dataAdaptor = New-Object System.Data.OleDb.OleDbDataAdapter $sqlCmd
$dataAdaptor.FillSchema($dataSet.Tables[1], [System.Data.SchemaType]::Mapped)
[void] $dataAdaptor.Fill($dataSet.Tables[0]);

Write-Host "Closing the connection" -ForegroundColor Cyan
$dbCon.Close();

Remove duplicate records:

It's time to load only the unique records. For each row in DT0 we check whether DT1 already has a row with the same Doc Number and Doc Revision; if not, we put the row into DT1. In the following code, we pipe the DT0.Rows collection to a ForEach-Object loop. In the loop, we try to select from DT1 any records matching the current object's Doc Number and Doc Revision columns. If the count is 0 we insert, otherwise we just log it. Have a look at the code below:

$duplicateRecordsCount = 0
$uniqRecordsCount = 0
$dataSet.Tables[0].Rows | ForEach-Object{
    If($dataSet.Tables[1].Select("[Doc Number] like '"+$_["Doc Number"]+"' and [Doc Revision] like '"+$_["Doc Revision"]+"'").Count -eq 0){
        
        $uniqRecordsCount++
        $newRow = $dataSet.Tables[1].NewRow()
        $newRow[0] = $uniqRecordsCount ##  ID
        $newRow[1] = $_[1]             ##  Name
        $newRow[2] = $_[2]             ##  DirectoryName 
        $newRow[3] = $_[3]             ##  FullName
        $newRow[4] = $_[4]             ##  Doc Name
        $newRow[5] = $_[5]             ##  Doc Number
        $newRow[6] = $_[6]             ##  Doc Revision
        $newRow[7] = $_[7]             ##  PROC_CODE
        $newRow[8] = $_[8]             ##  SUB_PROC_CODE
        $newRow[9] = $_[9]             ##  DOC_REV
        $newRow[10] = $_[10]           ##  DOC_REV_EXT
        $newRow[11] = $_[11]           ##  EXT

        $dataSet.Tables[1].Rows.Add($newRow)

        $rowCmt = "New row added...["+$dataSet.Tables[1].Rows.Count+"]"

        Write-Host $rowCmt  -ForegroundColor Green
    }
    Else {
        $duplicateRecordsCount++
        Write-Host "Already exists..."  -ForegroundColor Red
    }
}

In the above statement, $_ represents the current object in the loop. PS doesn't have an equality operator like "==" in C#; it uses "-eq" for equality checks. Once the loop reaches the end, we have the unique records in DT1.
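For comparison, the heart of this loop can be sketched in a few lines of JavaScript. The column names match the script; `dedupeRows` and the Set-based composite key are my own illustration, not part of the original code.

```javascript
// Keep the first row seen for each (Doc Number, Doc Revision) pair.
// This mirrors the Select(...).Count -eq 0 check in the PowerShell loop,
// but tracks composite keys in a Set instead of re-querying the target table.
function dedupeRows(rows) {
    var seen = new Set();
    var unique = [];
    rows.forEach(function (row) {
        // "\u0000" is just a separator unlikely to appear in real values.
        var key = row["Doc Number"] + "\u0000" + row["Doc Revision"];
        if (!seen.has(key)) {
            seen.add(key);
            unique.push(row);
        }
    });
    return unique;
}
```

The Set lookup is O(1) per row, whereas DataTable.Select re-scans DT1 on every iteration; for 18000+ rows that difference is noticeable.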

Export to CSV:

Now our last milestone: pushing the records into a CSV file. The code below does it:

Write-Host "Exporting to CSV" -ForegroundColor Cyan
$dataSet.Tables[1] | Export-Csv c:\logs\new_uote_docs.csv

With this, I am able to create a CSV file to be used by my Java program to upload into Windchill.

The following is the complete code that I used to do this.

$dataSource = "C:\Users\pradeepy\Documents\uote_docs.mdb"
$selectQuery = "SELECT * FROM FULL_DOCS_INFO"
$dsn = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=$dataSource;"

Write-Host "Creating connection to database" -ForegroundColor Cyan
$dbCon = New-Object System.Data.OleDb.OleDbConnection $dsn
$sqlCmd = New-Object System.Data.OleDb.OleDbCommand $selectQuery, $dbCon
$dbCon.Open()

Write-Host "Creating datasets" -ForegroundColor Cyan
$dataSet = New-Object System.Data.DataSet
$sourceDataTable = New-Object System.Data.DataTable "SOURCE_UOTE_DOCS"
$finalDataTable = New-Object System.Data.DataTable "FINAL_UOTE_DOCS"
$dataSet.Tables.Add($sourceDataTable)
$dataSet.Tables.Add($finalDataTable)

Write-Host "Fill schema of the table into a data table" -ForegroundColor Cyan
$dataAdaptor = New-Object System.Data.OleDb.OleDbDataAdapter $sqlCmd
$dataAdaptor.FillSchema($dataSet.Tables[1], [System.Data.SchemaType]::Mapped)
[void] $dataAdaptor.Fill($dataSet.Tables[0]);

Write-Host "Closing the connection" -ForegroundColor Cyan
$dbCon.Close();

Write-Host "No. of rows in dataset >" -ForegroundColor Green
$dataSet.Tables[0].Rows.Count | Select-Object

$duplicateRecordsCount = 0
$uniqRecordsCount = 0
$dataSet.Tables[0].Rows | ForEach-Object{
    If($dataSet.Tables[1].Select("[Doc Number] like '"+$_["Doc Number"]+"' and [Doc Revision] like '"+$_["Doc Revision"]+"'").Count -eq 0){
        
        $uniqRecordsCount++
        $newRow = $dataSet.Tables[1].NewRow()
        $newRow[0] = $uniqRecordsCount ##  ID
        $newRow[1] = $_[1]             ##  Name
        $newRow[2] = $_[2]             ##  DirectoryName 
        $newRow[3] = $_[3]             ##  FullName
        $newRow[4] = $_[4]             ##  Doc Name
        $newRow[5] = $_[5]             ##  Doc Number
        $newRow[6] = $_[6]             ##  Doc Revision
        $newRow[7] = $_[7]             ##  PROC_CODE
        $newRow[8] = $_[8]             ##  SUB_PROC_CODE
        $newRow[9] = $_[9]             ##  DOC_REV
        $newRow[10] = $_[10]           ##  DOC_REV_EXT
        $newRow[11] = $_[11]           ##  EXT

        $dataSet.Tables[1].Rows.Add($newRow)

        $rowCmt = "New row added...["+$dataSet.Tables[1].Rows.Count+"]"

        Write-Host $rowCmt  -ForegroundColor Green
    }
    Else {
        $duplicateRecordsCount++
        Write-Host "Already exists..."  -ForegroundColor Red
    }
}

Write-Host "No. of rows in new dataset >" -ForegroundColor Green
$dataSet.Tables[1].Rows.Count | Select-Object

Write-Host "Total duplicate records :" -ForegroundColor Red
Write-Host $duplicateRecordsCount

Write-Host "Exporting to CSV :" -ForegroundColor Cyan
$dataSet.Tables[1] | Export-Csv c:\logs\new_uote_docs.csv

My Journey into PowerShell

One more feather added to my cap: recently I started using PowerShell from Microsoft. The requirement is simple: our company has a huge pile of documents for a project, and they want to put all of them into our newly commissioned document management system, Windchill. Our consultants gave us a Java console application to load the documents into Windchill. That tool requires a CSV file with predefined columns, so it all fell onto my head to prepare one. I was given the location where those documents are sleeping, and I went there only to make myself sick: there are 18000+ documents... Since I had no other option, I started thinking about how to prepare the CSV file out of these. That CSV file should have the following:

  1. Document Name
  2. Location of document
  3. Document Number
  4. Document Version
  5. Process Code
  6. Sub-Process code
  7. Client document number

So how do I get these? Going through each document and noting down the details is not an option I want to exercise. I am a programmer at heart, fellas... I want to do things simply enough that I can manage them. With some keen observation I found that our engineering guys had made my life much easier: they put all of the information in the document name itself. So all I need is to parse the document name and put that information into a CSV file. That's easy, I thought to myself; all I need is to write a program to do it. Now here's the thing: maybe I was too bored of C#, or maybe I saw a geek video last night, but I wanted to do it in a much simpler way, with some script. At that moment I saw a blue icon with a command-prompt symbol lying on my task bar, a long-forgotten tool that I opened only after watching computer-geek movies, only to close it once again. It is PowerShell from Microsoft. OK, now it seems I have a solution for my problem. A quick web search revealed that this little guy can do many of the things I want, so it's time to learn something new. For me, learning new things has become a hobby, and it's pretty exciting. I started organizing my requirements, what I want from PowerShell, the algorithm:

  1. Go through all the folder recursively and read all document names
  2. List Name, revision, document number and physical path to the document.
  3. Push them into a CSV file and I am done….

Now my milestones to achieve this,

  1. List document names by recursively reading through the folders
  2. Check how to parse the document names and list them
  3. Finally push them into a CSV file

Listing document names recursively,

Falling back to my favourite search engine, Bing, I learned that "Get-ChildItem" will do the job for me. Here is my output:

[Image: Get-ChildItem output]

OK, now I have achieved my first goal and am moving on to the next one. This time I want to list the documents recursively. That's the easy part: "Get-ChildItem -Recurse" will do the job. That's cool, I crossed my second milestone too. Now my third step: how do I take a substring of a string? We can achieve this with the "Substring()" function. It takes two parameters: the index to start from and the length of the string to extract. See the following output:

[Image: Substring output]
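A quick aside for anyone following along in JavaScript rather than .NET: .NET's Substring takes a start index and a length, while JavaScript's substring takes start and (exclusive) end indices, so the same slice is written slightly differently:

```javascript
// .NET:  "PowerShell".Substring(0, 5)  gives "Power"  (start, length)
// JavaScript's substring uses (start, end-exclusive) instead:
var word = "PowerShell";
console.log(word.substring(0, 5)); // "Power"
// The legacy substr(start, length) matched .NET's shape, but is deprecated:
console.log(word.substr(0, 5));    // "Power"
```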

OK, going well. Now I need to feed the document name into Substring and list the pieces as individual columns. But wait, my output is not right: I want only two things, the document name and its physical path. Once I have those two, I can extract the remaining columns from the document name. One more thing: I don't want to list the folder names, only the files within them. So, it's time to play with "Get-ChildItem".

My immediate task is to exclude the directories; I can do this with "Get-ChildItem -Attributes !Directory". Let's see the output:

Original:

[Image: Get-ChildItem output including folders]

Here all the highlighted entries are folders in the current folder. Now let's apply that switch:

[Image: Get-ChildItem output with directories excluded]

They are gone... Great, now it's time to change the output. I want only the name of the document and the full path to it. We can achieve this by piping the output into another cmdlet named "Select-Object". Here is the output:

[Image: Select-Object output showing the Name and FullName columns]

At this point, I am able to get the name of the document and the full path to it. Now I need to extract the desired columns; remember, I need the document number and version.

Now, a little confession to make: the output you have seen in the screenshots so far is not from the production server, so the names are not accurate. The actual file names look like "TP123-8080-DWG-0001-Rev0 Some Name.Ext". In this name I am interested in the first part, which contains the document number and revision, and hopefully they are accurate. So all I need is to extract the file name up to the first space. Let's see how we can achieve this.

To take a substring of the input from Select-Object, we write an expression such as "{$_.Name.SubString(0, $_.Name.IndexOf(' '))}". Look at the output:

[Image: output with the raw expression as the column header]

If you observe closely, the column name is nothing but our expression. That's not what we want, so let's give it a proper name. For that purpose, we change the expression slightly to "@{Name="DOC_NUMBER"; Expression={$_.Name.SubString(0, $_.Name.IndexOf(' '))}}". The following is the output:

[Image: output with the named DOC_NUMBER column]

Here I need to refine the DOC_NUMBER expression to remove the revision part, because the revision number is not part of the document number. My refined expression is "@{Name="DOC_NUMBER"; Expression={$_.Name.SubString(0, $_.Name.SubString(0, $_.Name.IndexOf(' ')).LastIndexOf('-'))}}".

Great!!! Now let's add the revision number to our list. To achieve this, we reuse the document-number expression and take everything from the last "-" onward. Since the document number has a fixed length of 19 characters, we can take the substring from index 19 (the trailing hyphen) to the end. Here is my expression "@{Name="DOC_REVISION"; Expression={$_.Name.SubString(0, $_.Name.IndexOf(' ')).SubString(19)}}", and the following is the output:

[Image: output with the DOC_REVISION column added]

Now I am almost done; all that remains is to export this to CSV. To do that, we pipe the output to another cmdlet named Export-Csv, passing the file name with its path. Here is my final PowerShell command and its output:

Get-ChildItem -Attributes !Directory -Recurse |
    Select-Object Name, FullName,
        @{Name="DOC_NUMBER"; Expression={$_.Name.SubString(0, $_.Name.SubString(0, $_.Name.IndexOf(' ')).LastIndexOf('-'))}},
        @{Name="DOC_REVISION"; Expression={$_.Name.SubString(0, $_.Name.IndexOf(' ')).SubString(19)}} |
    Export-Csv c:\logs\exp.csv
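The two calculated columns boil down to simple string slicing. Here is the same logic as a small JavaScript sketch, using the example file name from above (`parseDocName` is my own name for it; note that unlike the SubString(19) expression, which keeps the leading hyphen on the revision, this version drops it):

```javascript
// Given "TP123-8080-DWG-0001-Rev0 Some Name.Ext":
//  - the part before the first space holds document number + revision,
//  - the document number is everything before that part's last hyphen,
//  - the revision is everything after that hyphen.
function parseDocName(fileName) {
    var head = fileName.substring(0, fileName.indexOf(" "));
    var cut = head.lastIndexOf("-");
    return {
        docNumber: head.substring(0, cut),
        docRevision: head.substring(cut + 1)
    };
}

var parsed = parseDocName("TP123-8080-DWG-0001-Rev0 Some Name.Ext");
console.log(parsed.docNumber);   // "TP123-8080-DWG-0001"
console.log(parsed.docRevision); // "Rev0"
```

Splitting on the last hyphen rather than a fixed index also keeps working if a document number ever deviates from the 19-character convention.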

[Image: the exported CSV file opened in Excel]

That's it, fellas. I hope you learned something today. What I realized is that PowerShell is really powerful; it is integrated into the system so deeply that we can do virtually anything with it.