Wednesday, February 28, 2007

The Evil and Annoying WHITE SPACE

I hate white space. I loathe white space. There are very few things I will say that I hate, and white space is close to the top right now. It has been the cause of countless mind-numbing hours of work on projects at my job and in school.

The two most recent encounters with the villain have occurred over the past two weeks. The first happened while I was working on my web service project, which I developed in PHP. You can read more about it under the Information Architecture section of my blog. The lesson I learned: do not leave any extra white space before or after the PHP tags (<?php and ?>). If you do, things will go haywire.
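
To illustrate the kind of thing that bites you (a made-up snippet, not my actual project code): even one blank line before the opening tag gets sent to the browser as output, and after that calls like header() fail with "headers already sent" and a SOAP client can no longer parse the response.

<?php
// Hypothetical example: this file is "clean," with nothing (not even a blank
// line) before the opening tag. If there were any white space up there, PHP
// would send it as output first, and the header() call below would fail with
// a "headers already sent" warning.
header('Content-Type: text/plain');
echo "Hello from a whitespace-free PHP file.";
// One safe habit: leave the closing tag off pure-PHP files entirely, so that
// trailing white space after it can never sneak in.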

The second occurrence was yesterday. I spent an extra 2.5 hours at work trying to figure out why two hostnames that looked exactly the same when compared were treated as different by my PHP script, which skipped right over them. I looked at everything that could possibly be wrong and had absolutely no luck. The thought then occurred to me, "Could there be some white space lurking about in some dark corner of my script?" I added the trim() function (a PHP function that strips white space from both ends of a string) to the variables being compared, and sure enough, it worked. The infamous white space had struck again.
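
In case it helps anyone else, here is roughly what the fix looked like (simplified from memory; the hostnames and variable names are made up):

<?php
// Hypothetical reconstruction of the comparison that kept failing. One
// hostname came out of a file with an invisible trailing newline, so the
// two strings were never actually equal.
$hostFromFile = "webserver01.example.com\n";  // read from a file, newline and all
$hostFromDb   = "webserver01.example.com";    // stored cleanly in the database

var_dump($hostFromFile == $hostFromDb);              // bool(false): they only *look* identical
var_dump(trim($hostFromFile) == trim($hostFromDb));  // bool(true): trim() strips the white space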

I guess you could count my encounter two weeks ago as some sort of a blessing, as my mind was still somewhat fresh on the frustrations white space had caused me before. Hopefully these experiences will be engraved in my brain so that in the future I will be able to diagnose the problem more quickly. On the other hand, it was kind of nice to see how easily the problem could be fixed, and that once the white space was eliminated, the program worked perfectly!

Tuesday, February 27, 2007

Omniture Competition Prelims

As an assignment, we were required to participate in the Omniture Web Analytics Competition held here at BYU. My group had been bogged down by other projects for a good part of the time we were given for analysis (if you read the Web Services Project Update, you'll see why I had so little time for anything else). So in the last week we put our heads together and came up with what we thought was a fairly solid approach. Since the competition is over for our team, I might as well disclose it. We catered our presentation to the client's mission statement, which we found on their home page (costumecraze.com).

We aimed our analysis at their goals of worldwide growth, an improved online experience, and unbeatable prices. Though we didn't spend loads of time on the analysis (there's only so much you can do and still present it in only 6 minutes), we felt like we found some pretty solid recommendations to give. We rehearsed a couple of times right before the real presentation and felt like we were ready.

The presentation went well. I did my usual: got really nervous, tripped over my words, and delivered incomplete thoughts. I think I explained things well enough, though, as there was only one question directed at me and I was able to answer it. My teammates did well, and with as few questions as we were asked after our presentation, we felt we did pretty well. Maybe we really did do pretty well and everyone else did that much better; who knows. I think I might still go see the final competition and try to win myself an iPod nano, that is, if my wife isn't in labor :)

Thursday, February 22, 2007

Omens, Luck, and Parasitic Trash



Today as I was driving down to Provo, engaged in an Anne Perry audio book, a large piece of trash was floating in the air threatening each car that passed under it with attachment. I, too, was hoping to be spared driving the remaining half of the journey with this piece of trash on my car. I was not so fortunate. My overly conspicuous antenna grasped the trash and waved it in the air for all to see. It was over one and a half times the length of my car. I'm sure the drivers around me felt my pain, as they had all hoped to be spared the attack of a lifeless parasite; but the drivers I would encounter ahead of me would laugh me to scorn and I had no way to say, "I didn't put it there, honestly!" It was a good 5 to 10 minutes before the leech released itself from my car and attacked another car a few cars back. The rest of the drive was fairly normal.

I ended up getting to class a few minutes late, but before the instructor had started teaching--my first stroke of luck. The lab we were doing went fairly well, considering the fact that several of the people around me didn't have their computers configured properly (as none of us were sitting at the computers we had personally configured last class period). I, of course, had no troubles at all with my lab, and was able to complete it fairly smoothly. Luck had struck again.

I was told in my lab that I would have to do a pre-presentation of our website analysis during class (when I was hoping to be given that class time to prepare for the real presentation this evening). I thought my luck had ended.

Bowling was next. The last time I had bowled in a suit, I had performed miserably. Today, however, was probably one of my best days ever. I first bowled a 131, not too bad considering my scores the last few weeks. But I was just warming up. I finished the hour by bowling one more game, scoring a lifetime high of 186, with two strikes in the last frame! Thank you, Lady Luck!

To add the cherry on top (not a maraschino one; I don't like that kind), I don't have to present in class today. The person who informed me was misinformed, or was trying to pull a prank (and actually did, for I was fooled and scared).

This concludes the tale of the Parasitic Trash.

Tuesday, February 13, 2007

WA: SiteCatalyst, Take 2

The time Dr. Liddle spent in class last week going through some of the online tutorials for Omniture's SiteCatalyst was very helpful. With as much as we have going on during the week, I would not have time to watch these videos otherwise, and they are very relevant to what I feel the purpose of the class is. I also agree with Ben Robison that we should build class discussion around the topics covered in the tutorials Drs. Liddle and Rogers decide to show.

Specifically, we discussed the importance of key performance indicators (KPIs). I remembered studying that term a little in my marketing class, and I was exposed to it even more last year in my consulting class. We had to make sure we fully understood the expectations of the client for whom we were working, so that we could measure our success by the agreed-upon metrics. We were able to better focus our meetings on the topics the client cared most about. What Omniture basically does is provide a service by which companies can better determine how their KPIs relate to their website, and how to better implement their online presence.

I've begun toying around with SiteCatalyst and find it fairly easy to work with. The hard part will be slicing the data and combining it in ways that yield meaningful insights. Two weeks doesn't seem like very much time to play with a tool like this; maybe our license will be extended because of our class :)

Saturday, February 10, 2007

IA: Executable Conceptual Models

Our assignment for the next class is to read over the Model Driven Architecture paper provided to us by Dr. Liddle. I admit I had no clue what this referred to at first, but as he began to give us an introduction to it on Wednesday it made more sense. About two years ago I took a couple of Systems Analysis and Design classes where we did something very similar to MDA. Some of the acronyms the paper uses include:

  • CIM - Computation Independent Model (Domain Model)
  • UML - Unified Modeling Language
  • PIM - Platform Independent Model
  • PSM - Platform Specific Model

The CIM or Domain Model is the first step in MDA. It is a model that is created independent of any computation your system will do. This is important to help you better understand the way in which you desire your system to perform without having to worry about any of the complicated computations that will need to be done. This model is usually drawn up using UML, which is a standard modeling language.


Once this is done, a Platform Independent Model is created. It describes the system in a little more detail, but not specifically enough to determine how it will interact with the platform it will be built on. It might consist of "enterprise, information and computational ODP viewpoint specifications."


The PIM is then "transformed" into the PSM, which "specifies how that system makes use of the chosen platform." The PSM could easily be an implementation, unless you decide it needs further revising, in which case it is more like a PIM, in that it is only a model. This step could very well begin to include program code, deployment descriptors, and other forms of configuration specifications.
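
As a toy illustration of that last step (my own example, not one from the paper): a PIM might say only that a Customer has a name and an email address, while the PSM/code level for, say, a PHP platform adds all the details the PIM deliberately leaves out.

<?php
// Toy example of a platform-specific artifact produced from the PIM-level
// concept "a Customer has a name and an email address." The PIM says nothing
// about PHP, constructors, or accessor methods; the transformation to the
// PSM/code level is what introduces those details.
class Customer
{
    private $name;
    private $email;

    public function __construct($name, $email)
    {
        $this->name  = $name;
        $this->email = $email;
    }

    public function getName()  { return $this->name; }
    public function getEmail() { return $this->email; }
}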


That is pretty much it. The paper goes into further detail on methods of transformation, how to actually use the MDA Pattern, and some MDA standards.

Thursday, February 08, 2007

Fortune Cookie


"Remember three months from this date! Your lucky star is shining." I had some really good Chinese food, $1/item! As much food as Panda Express' two item plate (plus rice) for only $3.54! If you're here in Provo, check out Fortune Chinese Buffet on University Ave. (about 450 N).

Wednesday, February 07, 2007

The Pornography Problem

Today in Data Communications our professor, Craig Lindstrom, decided to take a break from setting up VLANs and turn to a conversation that has gained a lot of steam on our email list, entitled "Sensitive subject, but it's a real problem we as IT people can't avoid." That subject deals with objectionable content that finds its way UNINVITED into our homes, particularly pornography. (I hope this blog doesn't get flagged just because I use the word.)

We started off trying to define it, and we ended up back at a well-known court decision defining obscenity. "In 1964, Justice Potter Stewart tried to explain 'hard-core' pornography, or what is obscene, by saying, 'I shall not today attempt further to define the kinds of material I understand to be embraced . . . [b]ut I know it when I see it . . . '" (Jacobellis v. Ohio, 378 U.S. 184, 197 (1964)). To some extent this is true: what is arousing or sexually stimulating to one person is not to another. Granted, there is quite a bit of material that requires no weighing of the facts and is outright pornography, end of story (e.g., Playboy).

What about "art?" Is the Venus de Milo considered pornography? Rodin's, Michaelangelo's, or any other renowned artist that saw the human body as one of the most beautiful of God's creation? You could go into the argument that maybe it has to do with the intent of the author, or creator. Did they intend for their creation to be sexually stimulating? Obviously we cannot ask the Rodin's and Michaelangelo's this question. To some this material is offensive, and they choose not to view it. To others it enriches their life and appreciation for God's creations.

What then can we do to make sure that the material we find offensive stays out of our homes and off of our computers? Currently this is virtually impossible. I think we could safely say, however, that accidental encounters with such obscene material have lessened over the years, thanks to pop-up and JavaScript blockers. There is an initiative being started referred to as CP80 (Clean Port 80). You can check out the full description of the initiative on their website. Basically it is trying to get legislation passed that would force all websites to be associated with specific channels, similar to cable television. If you want HBO, or some other channel with adult content, you can pay for it and get access to it. If you don't want it, you don't pay for it, and you shouldn't have to worry about it. This solution still provides freedom of speech to those who wish to enshroud their lives with such filth.

A slightly different solution was discussed in our email list regarding instituting an internet rating system similar to the one used by the MPAA, or on local television networks. If the content was rated, you could choose to be connected to the 'G' and 'PG' ports, and not to the others. This brought up the issue that some material on the internet cannot be categorized like that (e.g., medical information).

Another issue that was brought up was the fact that many times people, especially kids, accidentally run into obscene material. This happens when they innocently misspell something in a URL like disney.go.com/playhouse/today/index.html. If browsers had specific login accounts for different users, someone could create a plug-in that would check the spelling of the URL typed and, if it wasn't spelled correctly, offer a list of possibilities with a description of the content on the pages found. That way small children wouldn't accidentally stumble upon objectionable material.
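
Just to sketch the core of that idea (completely hypothetical, and in PHP only because that's what I've been writing lately; a real plug-in would live in the browser): compare the typed hostname against a parent-approved list and suggest the closest match instead of loading whatever the typo points at.

<?php
// Hypothetical sketch: suggest the closest parent-approved site for a
// misspelled hostname instead of silently loading whatever it resolves to.
$approvedSites = array('disney.go.com', 'pbskids.org', 'nick.com');

function suggestSite($typedHost, array $approved)
{
    $best = null;
    $bestDistance = PHP_INT_MAX;
    foreach ($approved as $site) {
        $distance = levenshtein(strtolower($typedHost), $site);
        if ($distance < $bestDistance) {
            $bestDistance = $distance;
            $best = $site;
        }
    }
    // Only make a suggestion when the typo is reasonably close to an approved site.
    return ($bestDistance <= 3) ? $best : null;
}

echo suggestSite('disnye.go.com', $approvedSites); // prints "disney.go.com"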

Class was just about over, and our professor decided to give us his idea of what could be done. He suggested sticking with the rating idea, but in a more objective way. Ratings would describe the type of data contained on the page: violent, erotic, medical, educational, entertainment, etc. That way all data could be categorized, and you could choose the type of data you would allow on your computer.

Well, that's all for now. I could probably go on about this for a while. Let me know your thoughts; I'd love to hear what other people think.

Tuesday, February 06, 2007

IA: Web Services Project Update

02/01/07
I gave Visual Studio another try, and with the help of a tutorial on web services I was able to actually set one up!!! I'm still far from being a professional, but I at least understand a little bit more of what's actually going on. The thing that confused me was that when I selected "Create Web Service" in VS2005, three files were created: web.config, service.asmx, and service.vb. I'm still not sure what they each do individually, but I pasted in some functions that converted between Celsius and Fahrenheit, selected build, and the service was built. And then I didn't know what to do. I went further through the tutorial and saw that if you wanted to integrate this service into your website, all you had to do was reference where the service was. I created a simple .html file and voilà, it worked. I'm still not sure what the .vb or .config files actually do, or where SOAP and WSDL actually come into play, but the .asmx file is where you create the "web methods" that your service needs to perform.
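
For what it's worth, the conversion logic itself is tiny. Here is the same idea written in PHP, just to show how little a "web method" body really is (the functions I actually pasted into service.vb were the tutorial's VB.NET):

<?php
// The same two temperature conversions, expressed in PHP purely for illustration.
function celsiusToFahrenheit($c)
{
    return ($c * 9 / 5) + 32;
}

function fahrenheitToCelsius($f)
{
    return ($f - 32) * 5 / 9;
}

echo celsiusToFahrenheit(100) . "\n"; // 212
echo fahrenheitToCelsius(32) . "\n";  // 0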


02/02/07
I was able to help someone else get going with .NET, at least to the point I had reached with the web service I created on the first of the month, the converter between Fahrenheit and Celsius. So that helped solidify my knowledge a little bit.


02/05/07
I'm still trying to understand how to piece everything together in .NET. I created another simple web service from a tutorial on Microsoft's website, and it helped define some of the different files VS creates with any given web service. They referred to the .vb file as the web service and the .asmx file as the ASP file (I'm still not sure what this does). The web.config has a lot of XML in it, and I wonder if it has anything to do with the WSDL. The web service I created was a simple math service. You can choose any of the four basic operations, plug two numbers in, and it will give you an answer in XML. When you select the operation, you are taken to a page that has two text boxes labeled A and B in which you can put the numbers you want operated on. Beneath those boxes are two SOAP examples. I imagine they're showing how the web service communicates with the server where the code lives. As far as I understand from yesterday's class, we don't need to make a web page, just a service that returns well-formatted responses so that it could easily be tied to a web page if need be. We just need to get the information from the weather database in Dr. Liddle's MySQL DB and format it correctly on our side.


02/09/07
I began to lose hope that I'd ever understand web services. I understood the concept: a WSDL is basically a contract that lives on the server and lets subscribers see what methods they can call to use the web service on that server, and the SOAP requests and responses are the way the client and server communicate. That is great to know, but how to practically put it to use was a totally different question. I've had some experience with Java, so I decided to try something different than the Java EE route. That could have been my first mistake. I installed Visual Studio (as mentioned above) and had a little success with it, but I still had no idea where the WSDL was, how it was created, or how the service was even running. There was too much magic going on behind the scenes.

Luckily, I was notified of a WSDL seminar that Jimmy Zimmerman was teaching this morning. He made it so much easier to understand. He took a random WSDL and showed us how he modified it to work with a simple math web service he created. Dr. Liddle did something similar, but it was with an already relatively complicated web service, and even Jimmy couldn't follow him. Suffice it to say, I think I'm going to take the PHP route. I'd still like to try to better understand .NET, but with a due date on this assignment, I don't want to kill myself trying to get it done.
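
For the record, the consumer side in PHP turns out to be only a few lines once a WSDL exists somewhere (the URL and the add() operation below are made up, just to show the shape of it):

<?php
// Minimal sketch of a SOAP consumer in PHP 5. The WSDL URL and the add()
// operation are hypothetical; the point is that the WSDL tells SoapClient
// which operations exist, so you can call them like local methods.
$client = new SoapClient('http://example.com/mathservice?wsdl');

$sum = $client->add(2, 3);
echo $sum; // expected: 5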

I'm going to spend a few more hours today trying to put together a WSDL myself. I'll let you know how it goes.

02/19/07
This project is driving me nuts. I've spent countless hours trying to get it straight, and I have had no luck. I've talked to people in the class, and I've consulted every web source Google points me to when I search for the errors I'm getting. I talked with Dr. Liddle and got a little help, but I haven't had much of a chance to talk to him since.


The part I cannot get, nor can anyone else I've talked to who's doing it in PHP, is how to return a complex type. I keep getting this error:

Fatal error: Uncaught SoapFault exception: [SOAP-ENV:Server] SOAP-ERROR: Encoding: object hasn't 'cityzip' property in C:\wamp\www\client4Liddle.php:6

Stack trace:
#0 [internal function]: SoapClient->__call('getWeather', Array)
#1 C:\wamp\www\client4Liddle.php(6): SoapClient->getWeather('84602')
#2 C:\wamp\www\client4Liddle.php(13): getWeatherDetails()
#3 {main}
thrown in C:\wamp\www\client4Liddle.php on line 6

This assignment is due tomorrow, and I don't know if I will be able to finish it. I'm getting ready to throw in the towel. Maybe I'll write something about the business case for something I can't figure out.

***Later that day...***
Jimmy Zimmerman saved the day again. He had run into the same problems that I was having. Apparently, PHP doesn't like the degree symbol, and for that reason was throwing the above error. Another problem I was having was putting spaces in the keys of my associative array (e.g., $weather['City Zip'] = $row->cityzip;). SOAP doesn't like the spaces and will throw an error in that case as well. You also cannot have any leading or trailing white space before or after the <?php and ?> tags, or the SOAP client will not be able to get the WSDL. After these issues were resolved, I was finally able to finish this project.
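
For anyone else fighting the same battle, here is roughly the pattern that ended up working for me (simplified, with made-up field names rather than my exact assignment code):

<?php
// Simplified sketch of the server side that finally worked. Three rules:
// 1. no white space before the opening PHP tag or after the closing one,
// 2. no spaces in the keys of the returned associative array,
// 3. no degree symbol in the data; spell out "degrees F" instead.
function getWeather($cityzip)
{
    // Pretend this row came out of the MySQL weather database.
    $row = array('cityzip' => $cityzip, 'temp' => 41, 'conditions' => 'Partly cloudy');

    $weather = array();
    $weather['CityZip']    = trim($row['cityzip']);       // 'City Zip' (with a space) breaks the SOAP encoding
    $weather['TempF']      = $row['temp'] . ' degrees F'; // the degree symbol is what threw the fatal error
    $weather['Conditions'] = $row['conditions'];
    return $weather;
}

$server = new SoapServer('weather.wsdl'); // hand-written WSDL sitting next to this script
$server->addFunction('getWeather');
$server->handle();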

Business Case for Web Services and SOA
I'm not completely turned off by web services just because of this assignment. With all of the searching I've done on Google, I've definitely seen that web services are widely used in all sorts of areas. What I basically understand of web services is that they provide a resource for anybody who wants to request their information. With the example of the weather web service we were asked to implement, anyone who wanted could write a requester (client) to connect to the web service that I wrote (if it ever worked). They could have the information that was retrieved plugged into their own website.

I think the best example in class was that of the personalized Google homepage. You can insert any number of web services onto your page to give you news updates, weather updates, games, etc. This may not be the best example by which Dr. Liddle would have us understand web services, but he didn't discount it.

Using Google as an example, I could see web services being of great benefit to any enterprise. In my web analytics class we are currently beginning to understand what dashboards are and how they can be useful. A dashboard gives you a quick visual overview of several aspects of something you wish to monitor. Copious amounts of information can be conveyed by a well-selected combination of graphs, statistics, or other reports on these dashboards.

A company's internal or external website could be composed of several web services that provide a similar service to Google's homepage, but more of a dashboard for specific updates on the progress of certain products, order information, etc. With order information, you could integrate a map service with a shipment-tracking service to display on the map the approximate location of your order, along with the number of stops left and the approximate arrival date and time. Any delays could be annotated on it as well. For internal information, several teams are usually working on the development of a given product. These teams could post updated statistical information from the various tests they perform throughout the development and polishing of the new product.

A generic Service Oriented Architecture:



The concept of web services was discussed in the context of a Service Oriented Architecture. We had a difficult time defining SOA in class, but came up with some good attributes of this architecture:
  • Loosely coupled
  • Highly cohesive
  • Web services
  • Collection of communicating systems
  • Enterprise Service Bus
There are probably plenty of other attributes, but this is a good start. With loose coupling comes less interdependence, and thus a smoother-running system. When a system is weighed down by all its interdependencies, if one piece doesn't work the whole system can easily go down.

High cohesion goes right along with loose coupling. Each piece of the system is self-contained. Because of the low number of interdependencies, each individual part of the system is dedicated to performing one aspect of the system well. Web services were explained above.

With a collection of highly cohesive, loosely coupled systems, the pieces must be able to communicate well in order to provide a valuable service. An enterprise service bus (ESB) is a backbone that all the systems can plug into; it's what provides the communication ability to the different services. The ESB is made up of a lot of machinery and allows for lower coupling, because each system links up to the ESB rather than directly to every other system. Here's a diagram of an ESB that IBM has posted:

Overall I think that web services and SOA are a very good solution for businesses. A modularized environment allows for easier implementation of change, and organizations are constantly undergoing change. New technologies emerge that give companies an edge over the others (at least until the other companies implement the same change, at which point it becomes a necessity just to keep up with everyone else on the technology wave). If each of the services a company offers is loosely coupled and connected to an ESB, each module can be changed without affecting the entire system.

This concept is very similar to hot-swappable media used in enterprise network devices. Different components of the switches or routers can be replaced without having to unplug or reset the device, allowing for a seemingly uninterrupted change to the enterprise network.

Monday, February 05, 2007

Roadkill and Flannel Jackets


Today I had to drive down to Provo for school, and luckily I had my wife's car, which has special permission to travel in the carpool lane with just me in it. I was cruising along on the empty morning roads in Happy Valley, groovin' to the tunes of the Aquabats. As Captain Hampton was running away from the surprise attack of the fierce midget pirates of Willy Goat, a dead animal appeared right in the middle of my lane. I didn't have time to swerve out of the way, so I tried to line my car up so as not to hit it with my tires (it was a rather big obstacle, about the size of a bobcat). I didn't see it come out from under my car in my rear-view mirror...uh oh. I heard something banging against the bottom of the car and was hoping the animal wasn't stuck underneath. I could almost smell the rotting odor of the poor animal, and could imagine everyone looking at me as some cruel, unjust animal abuser..."But I didn't mean to!!!" I tried changing lanes to see if it would come loose, but that didn't work either. I bit my cheek and made it all the way to school, hoping not to find anything stuck to the car. As I got out, I looked under the car, and there it was...a large red flannel jacket! I had to laugh out loud; that was such a relief! And that was the beginning of my day!

Friday, February 02, 2007

WA: Jim Burchell, Omniture

Jim Burchell of Omniture came and spoke to us a little about the history of web analytics, and then spent the remainder of the time answering any questions we had. He's worked for three web analytics companies over the last 6 or 7 years: WebSideStory, WebTrends, and currently Omniture.


We discussed the difference between analytics that analyze the web server's log files (examining all the HTTP requests) and analytics that put JavaScript on your web pages. Log files are less accurate; they tend to either inflate or deflate certain numbers. For example, users whose ISP is AOL tend to be aggregated behind shared proxy IP addresses. When you only examine the log files, each of those shared IP addresses appears as a single user to the web server and is counted as one unique visitor, greatly deflating the actual number.
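
To see why, imagine counting "unique visitors" straight out of an access log (a toy example with made-up log lines):

<?php
// Toy illustration of the deflation problem: three different people browse
// through the same shared proxy IP, so a log-based count sees one visitor.
$logLines = array(
    '203.0.113.10 - - [02/Feb/2007:10:01:12 -0700] "GET /home HTTP/1.1" 200',
    '203.0.113.10 - - [02/Feb/2007:10:02:45 -0700] "GET /products HTTP/1.1" 200',
    '203.0.113.10 - - [02/Feb/2007:10:03:02 -0700] "GET /home HTTP/1.1" 200',
);

$uniqueIps = array();
foreach ($logLines as $line) {
    $parts = explode(' ', $line);
    $uniqueIps[$parts[0]] = true; // key the array by the client IP address
}

echo count($uniqueIps); // prints 1, even though three real visitors were browsing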


With the JavaScript technique, page views are counted much more accurately, since the script fires each time a page is actually viewed. Log files, on the other hand, have a harder time distinguishing between hits and page views and deciding what should actually count as a page view, so their numbers tend to be a lot higher than they should be. This particular problem causes headaches for companies like Omniture, because when they institute their method, it appears that their client's web traffic is actually decreasing.


A question was posed about the difference between Google Analytics and what Omniture does. While this may seem a naïve question to ask, I think Jim answered it very well, and I doubt anyone in the class really knew the answer. He said Omniture provides depth. For example, take someone in Oregon who looked at some brown shoes on Nike's website but didn't end up buying them. Omniture could tell you the different regions where certain people looked at a certain product, as well as their age, which provides the company with very accurate data with which to remarket to their customers, making their marketing endeavors more successful.

Thursday, February 01, 2007

WA: SiteCatalyst

I realize this might not be read today, as it is coming in a little late. The guest speaker last time was great; his brief introduction to Omniture's SiteCatalyst really got me excited for the chance we'll have to use it in the competition and in class. I had no idea how much information can be monitored on a website. I had never really thought about the ability to track where certain people get hung up in a company's shopping cart and end up dropping out. That information would be vital to increasing total revenue.

The brief insight into their new Discover application also sounds awesome. The ability to segment your results by different markets will be very well received. I'll post more about what SiteCatalyst can actually do as I get more involved with it.

IA: Setting up Tomcat and Axis

I realize that technology rarely works the way it is supposed to, so I was not the least bit fazed by the trouble we had setting up Tomcat and Axis in class. It really doesn't seem terribly difficult. The reason I'm choosing not to use Java for the web service assignment is that I would like to learn something new.

So far I've tinkered around a little bit with installing Ruby and InstantRails. I don't think I've been able to set it up successfully yet. I haven't spent loads of time on it, but enough to realize that it is not a piece of cake.

I've also installed Visual Studio 2005 to try out the .NET way of creating a web service. I've run into some barriers there as well, and it is not as easy to understand as one would hope. Nonetheless, it looks like a very powerful application, so I think I'm going to pursue it a little bit more, along with Ruby, and see which one takes me where I need to go more quickly.