December 29, 2008
This blog has been migrated to http://blog.augustoalvarez.com.ar/
I will continue posting there.
November 15, 2008
It’s been a few weeks since its release, but I finally managed to get my hands on Hyper-V Server.
I was very curious about it: a free operating system released by Microsoft that works solely as a hypervisor makes you wonder about a lot of things. I’ve also recently been working with VMware ESX Server 3i, which is likewise a hypervisor running directly on the hardware, and I had a good experience (I really loved the monitoring and reporting features that you can use).
From the moment I started using Hyper-V Server, a few troubleshooting tasks needed to be done.
Installing Hyper-V Server
If you have ever installed any operating system, you should not have any problem with this. You’ll see, of course, that the process is identical to Vista and Windows 2008.
To get started with Hyper-V Server, the Hyper-V Server 2008 Configuration Guide is available.
If you want to avoid almost all command-line work from now on, Hyper-V Server has a simple tool that loads a menu giving access to most of the configurations you will need. You can access it using this cmd:
But I’ll execute the next steps using the command line, so this procedure also applies to Windows 2008 Server Core.
To start using Hyper-V Server you will need the Hyper-V console on your Vista SP1 machine; it is the same console used to remotely manage any other Windows 2008 with Hyper-V. If you don’t have it yet, you can download it from here:
But from this moment on, I started to have a few problems.
1. Solving "Access denied. Unable to establish communication between: <Hyper-V Server> and <Vista client>"
Anyone who used the early versions of this remote console probably ran into the same error.
The solution is the same, so I want to reference this post from John Howard’s blog, which explains almost everything you need to know about configuring the Hyper-V role on a Windows 2008 Core server. Hyper-V Server works the same way as the Core version of Windows 2008, so every configuration step applies.
Here’s a quick summary of the steps involved; I’m only applying the steps I considered necessary for my environment.
1. Since I’m using a domain environment, I joined the machine to the domain using the NETDOM utility:
netdom join <ComputerName> /domain:<DomainName> /userd:<UserName> /passwordd:*
The asterisk in /passwordd:* causes the user’s password to be prompted for interactively.
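For example, with a hypothetical computer and domain name (both placeholders), the full command would look like this:

```
netdom join HYPERV01 /domain:contoso.local /userd:Administrator /passwordd:*
```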
Reboot the machine to apply the changes:
shutdown /t 0 /r
2. Adding necessary rules on the Firewall to allow remote connections.
a. Remote Management:
netsh advfirewall firewall set rule group="Remote Administration" new enable=yes
Note: You can also use netsh to change the server’s IP, using the following syntax:
netsh interface ip set address "<Adapter Name>" static <ipaddr> <subnetmask> <gateway> <metric>
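For instance, assuming an adapter named "Local Area Connection" and an illustrative addressing scheme, setting a static IP would look like:

```
netsh interface ip set address "Local Area Connection" static 192.168.1.10 255.255.255.0 192.168.1.1 1
```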
b. Enable Remote Desktop:
cscript \windows\system32\scregedit.wsf /ar 0
cscript \windows\system32\scregedit.wsf /cs 0
The /ar 0 setting enables Remote Desktop connections; /cs 0 relaxes the authentication requirement so pre-Vista clients can also connect.
c. Reboot the machine to apply the changes:
shutdown /t 0 /r
3. Solving the "Access denied" error from the client:
Now that the server is properly configured for remote management, you have to run a simple procedure to fix this common error:
a. In "Run", type "DCOMCNFG" and click OK.
b. Expand "Component Services", then expand "Computers". Right-click "My Computer" and click "Properties".
c. Now click on "COM Security"
d. In "Access Permission" click "Edit Limits"
e. Select "ANONYMOUS LOGON" under "Group or user names" and, in the "Allow" column, check the "Remote Access" permission.
Now you should be able to connect remotely using the Hyper-V console.
With the Hyper-V Server finally configured for remote management, the obvious next step was creating a new virtual machine.
I started with a dummy virtual machine, just for testing. But in the last step of the virtual machine creation wizard I got "The virtual machine could not be started because the hypervisor is not running." Ouch!
2. Solving "The virtual machine could not be started because the hypervisor is not running"
You should not worry if you see this error. There’s a good chance that your hardware is not the problem and that the virtualization feature on your processor is working fine.
Even though the hardware on your server supports Hyper-V and the service is correctly installed, what happens is that the hypervisor was not added to the boot configuration, so the service was not started.
To solve this, you only need to run this command line:
BCDEdit /set hypervisorlaunchtype auto
Ok, after a reboot, NOW you can start using Hyper-V Server.
Adding Features to Hyper-V Server
Don’t get all excited; as mentioned before, this is just a hypervisor and you should not expect that much functionality to be available.
Most of the features (not roles) that you can install are there to increase security and to achieve interoperability with other platforms like System Center Virtual Machine Manager or Data Protection Manager, also supporting Live Backup (backing up a virtual machine without downtime).
To access all the features available, as in Server Core, from cmd:
To install one of the features use: start /w ocsetup <NameofService> (for instance, on this Hyper-V Server I installed TelnetClient)
You’ll also find that Hyper-V Server includes a WMI interface for remote-management extensibility. Here you can find more information:
Hope you find it useful.
November 15, 2008
The event was mainly organized by an old Student Partner colleague, Fernando Hualpa, together with Ángel Arcoraci and Marcelo Quevedo.
On the Microsoft side, I made the trip with Pablo Listingart and the one and only Alejandro Ponicke.
Some of the topics on this one day event were:
- XNA: Little story about gaming and with a short example about XNA.
- Expression Blend: Using the Expression suite together with Visual Studio.
- Hyper-V: Using Hyper-V together with Physical to Virtual migrations.
- Windows 2008 + IIS 7: Linux interoperability.
Ponicke’s machine with 192 cores and 200 GB of memory
Even though I didn’t have much time to walk around Mendoza, it is a lovely city that I recommend to visit. I really hope I get the chance soon enough to spend some time there.
November 10, 2008
Ok, I’ve been a Twitter user for several weeks now. It started when I saw a little sketch about the evolution of blogs.
Sounds like a topic for debate, doesn’t it? This "twittering world" is something that got me thinking: why does it exist, and why do so many people use it? I’m sure there can be a lot of answers, from Twitter lovers and Twitter haters alike.
I’m pretty sure there are hundreds of examples of people who tweet for no reason other than to tweet (and to get other people’s attention). But what if it’s more than that? What if you also want to share experiences, knowledge, information? It would take something like a geek Twitter to do that… and be certain that there are millions of those from whom you can learn a lot of things.
And what if it’s not a geek Twitter? I think this is still a great social experiment where you can learn a lot, like getting to know other people in their normal (non-geek) lives. I’m a believer that the smallest and simplest aspects of people’s lives are what define a person. Why can’t Twitter be a tiny way to understand all that?
I don’t want to turn this into a whole philosophical discussion, so I’ll just share my experience and some of the people I usually follow:
My user: @augustoalvarez
Angel "Java" Lopez: @ajlopez
Johnny Halife: @johnnyhalife
Martin Salias: @martinsalias
Paulo Arancibia: @parancibia
Gabriel Szlechtman: @gabrielsz
Ezequiel Morito: @ezequielm
Federico Boerr: @federicoboerr
Miguel Saez: @masaez
Matias Woloski: @woloski
Ezequiel Jadib: @ejadib
Juan Manuel Moyano: @jmmoyano
Edgardo Rossetto: @erossetto
Christian Linacre: @linacre
Alejandro Ponicke (not so much a twitter fan so far): @ponicke
Oh, and if you ask, twhirl is my preferred twittering tool.
November 10, 2008
I’ve always been excited about getting a Microsoft certification: the different types of certification you can obtain within Microsoft technologies, the deep knowledge and experience, and the recognition value always felt like good reasons to get certified.
But I never thought that my first achievement in this regard would be a SQL Server certification: Microsoft Certified IT Professional Database Administrator (SQL Server 2005).
I was never an expert on SQL, but it sure sounded like a great challenge when we started discussing it with my teammates. So we started a few weeks ago, practicing and learning practically from scratch in many aspects.
70-431 SQL Server 2005: Implementation and Maintenance
This is indeed the exam where you should start your SQL certification. Not only do you get to know about installing SQL Server, you also learn the differences between the various editions and several basics of SQL management.
This exam has 35 questions and 12 simulations. The simulations are mainly oriented toward using backup features.
More info here.
70-443 Designing a Database Server Infrastructure by Using Microsoft SQL Server 2005
This exam was a little strange compared to the other ones. For a start, you’ll see that it is based on case studies: a wide description of a fictional company, covering the platform it currently uses, business goals, objectives and much more.
You’ll find that the design (meaning your answers) for each case study will be different, because of the differences in the scenario described.
There are 6 case studies with 6 to 11 questions each. An important note: once you complete a case study, if you did not review any of its questions, you won’t get the chance to review them after you move on to a different case.
No simulations on this one. More info here.
70-444 Optimizing and Maintaining a Database Administration Solution by Using Microsoft SQL Server 2005
This is a more in-depth version of the first exam, 70-431. You’ll find several concepts for improving your database infrastructure, and you also get to cover some topics at a more advanced level.
52 questions with no simulations. More info here.
70-444 completed! Yeeeiiiii! Score: 1000
What about SQL Server 2008?
If you are already a MCITP DBA on SQL Server 2005, you can take 70-453 exam (Transition your MCITP Database Administrator Skills to MCITP Database Administrator 2008) and upgrade it to SQL Server 2008.
And the good news, if you don’t have any certification, you can directly access MCITP DBA SQL Server 2008 by taking one exam: 70-450 Designing, Optimizing and Maintaining a Database Server Infrastructure using Microsoft SQL Server 2008.
Here you can find a poster about SQL Server 2008 certifications.
Hope you find this information useful.
October 11, 2008
Another Microsoft academic event and another great time spent with the Southworks crew.
This time the UAI (Universidad Abierta Interamericana) hosted the Code Camp. Hundreds of students and professionals attended this event, organized by Microsoft Universidades, which featured over 40 sessions throughout the day, in which I had the pleasure of presenting a renewed version of Windows 2008 + IIS7.
On this occasion, besides running IIS with PHP, I showed a simple setup using a working Linux + Apache server as an iSCSI target that provides storage to remote machines, which see this remote storage as locally attached hard drives. And, of course, all the benefits of serving these sites on IIS7: Failed Request Tracing, reporting, etc.
For those interested, here’s the ppt file from my presentation (Spanish): windows-server-2008-iis-7-interoperabilidad.
NAS solution using iSCSI (graphic from http://www.wikipedia.org/)
Setting aside all the geeky stuff, it was really fun working on this event with Southworks and other friends who also participated as speakers or organizers. The other southies presenting here: Alberto Ortega, Matias Woloski, Johnny Halife, Pablo “Lito” Damiani, Ezequiel Jadib, Martin Salias, Angel “Java” Lopez, Paulo Arancibia, Federico Boerr and, now a former southie, Miguel Saez; good old tech partners like Alejandro Ponicke and his crew participated as well. And speaking of my home town, GENTI and .NetSgo (also academic cells) came all the way from Santiago del Estero to participate.
Some of the southworks crew at the event stand
Introducing the company to a few people also
With Miguel Saez
And continuing with the academic events, I’ll also be visiting Mendoza for another Cells on Camp event in November.
We’ll see each other then!
September 7, 2008
Recently, on August 25 and 26, I spoke at a Cells on Camp event (a preview of another Microsoft event, Code Camp) held in Santiago del Estero (Argentina), my home town. I presented on IIS7: interoperability with PHP.
It is always special to participate in any event, but I have to say that visiting my home town (which I don’t do all that often) and my university brings a different kind of feeling. My professional life began with those events and activities: I belong (now as a remote member) to a study group called GENTI (Information Technologies Study Group); we started it around October/November 2005 with a group of students and professors from the university, with Alejandro Ponicke (Microsoft’s South Cone IT Evangelist) as a mentor. We began learning everything from scratch; we didn’t have much experience or knowledge, but we sure knew what we wanted and we went after it.
At our first public event (May 2006) everything went perfectly; the auditorium was too small for all the people that attended (~350 at the first conference, with only 300 seats available). Alejandro Ponicke supported us with a few conferences, and I also participated as a speaker. And I met a great friend and co-worker of mine: Johnny Halife. Because of that event, I am now working here at Southworks.
Code Camp 2007: Memories with Johnny Halife
So, whenever these types of academic events appear, no matter what technologies are involved, I think of them as great opportunities for young students and the whole community behind them. That’s the case for my own university and my town; they have always suffered from a lack of technological events, which does not help their future professionals keep up with all the constant technological advances.
Making this small contribution with a conference in Santiago del Estero makes me really proud of all the work and ideals we had, and still have, since we started GENTI. I really hope these kinds of events become a tradition and happen often for the whole community.
In the meantime, I’ll start preparing my next conference at Code Camp 2008.
We’ll see each other then.
August 3, 2008
At this point we’ve already installed and properly configured Windows Deployment Services on Windows 2003/Windows 2008 (Part I), and we created a full image (with programs and features installed) on our Windows Vista machine and uploaded it to the server (Part II); the only thing missing is creating the answer files that will be used with the images to achieve a fully unattended installation of our operating system.
For those using Windows Server 2003 SP1, we saw that among the requirements for the WDS installation was the Windows Automated Installation Kit. This kit also gives us an important tool for creating the unattended files: Windows System Image Manager.
So, on any other platform used for WDS, it’s recommended to download this kit and install System Image Manager. Strictly speaking, the tool is not a requirement for creating the unattended files, but it makes the job much easier.
Preparing the Files Using System Image Manager
System Image Manager gives us a way, using the .wim (or .clg) file of an installation, to select the components needed for the answer files. This way we can be sure that the answer options selected are used in the right place at the right time:
Open System Image Manager from the Start Menu.
Click on File and click on Select Windows Image
Select the .wim file that we previously created or just use the file from the installation media (install.wim).
You can also select catalog files (.clg): these are the files specific to each Windows Vista edition (Home, Enterprise, Ultimate, etc.).
To start creating one, on the File menu select New Answer File.
Adding a .wim or .clg file is not a requirement, but you won’t be able to validate or check for errors in unattended files that do not use an OS image as a reference.
For more information, visit the site Windows System Image Manager Technical Reference.
We’ll create two files that are necessary for a complete unattended image installation: WDSClientUnattend.xml and AutoUnattend.xml.
Note: All of the components that we’ll add here are associated with 32-bit images because the installation file selected has that architecture. If you uploaded a 64-bit image, you’ll see an answer file with x64 components.
This is the first file used by WDS to answer all the initial configuration questions in Windows PE: disk partitioning (creating, modifying) and selecting which image from WDS we are going to install.
As you can imagine, all the components that we add will go in the Windows PE cycle:
Here’s an example of the components and values that can be inserted in the answer file. With the values I have set here I’m doing the following:
- Setup and keyboard language: English
- Delete and create a single NTFS partition on root disk.
- The partition will take the complete size of the HD.
- Label: system.
- Installation Group: VistaInstallation. This is the group that we created when we uploaded the image.
- File: Install.wim. This is the name of the installation file that we uploaded.
- Image Name: This is the descriptive name we gave the installation image we uploaded.
- User and domain name that will be used to log in and choose among the available images. No elevated privileges are needed for this user.
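Tying those values together, a minimal WDSClientUnattend.xml built with System Image Manager might look like the sketch below. The domain, user name, password and image name are placeholders; the image group and file name follow the examples above. Always generate and validate the real file with System Image Manager rather than copying this by hand:

```xml
<?xml version="1.0" encoding="utf-8"?>
<unattend xmlns="urn:schemas-microsoft-com:unattend"
          xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State">
  <settings pass="windowsPE">
    <component name="Microsoft-Windows-Setup" processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
      <DiskConfiguration>
        <Disk wcm:action="add">
          <DiskID>0</DiskID>
          <WillWipeDisk>true</WillWipeDisk>
          <CreatePartitions>
            <!-- One primary partition taking the whole disk -->
            <CreatePartition wcm:action="add">
              <Order>1</Order>
              <Type>Primary</Type>
              <Extend>true</Extend>
            </CreatePartition>
          </CreatePartitions>
          <ModifyPartitions>
            <!-- Format it as NTFS with the label "system" -->
            <ModifyPartition wcm:action="add">
              <Order>1</Order>
              <PartitionID>1</PartitionID>
              <Format>NTFS</Format>
              <Label>system</Label>
            </ModifyPartition>
          </ModifyPartitions>
        </Disk>
      </DiskConfiguration>
      <WindowsDeploymentServices>
        <Login>
          <!-- Plain domain user; no elevated privileges required -->
          <Credentials>
            <Domain>contoso.local</Domain>
            <Username>wdsuser</Username>
            <Password>Passw0rd!</Password>
          </Credentials>
        </Login>
        <ImageSelection>
          <InstallImage>
            <ImageGroup>VistaInstallation</ImageGroup>
            <Filename>Install.wim</Filename>
            <ImageName>Windows Vista Full Image</ImageName>
          </InstallImage>
          <InstallTo>
            <DiskID>0</DiskID>
            <PartitionID>1</PartitionID>
          </InstallTo>
        </ImageSelection>
      </WindowsDeploymentServices>
    </component>
    <component name="Microsoft-Windows-International-Core-WinPE" processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
      <!-- Setup and keyboard language: English -->
      <SetupUILanguage>
        <UILanguage>en-US</UILanguage>
      </SetupUILanguage>
      <InputLocale>en-US</InputLocale>
      <SystemLocale>en-US</SystemLocale>
      <UILanguage>en-US</UILanguage>
      <UserLocale>en-US</UserLocale>
    </component>
  </settings>
</unattend>
```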
To confirm that the answer file has no errors, click on Tools and Validate Answer File. All the errors will be displayed and explained on the lower panel.
Once the file is validated, click Save and use the name WDSClientUnattend. This file must be stored inside the default installation-files folder: /RemoteInstall/WDSClientUnattend.xml
This is the file we use to answer all the Vista configuration questions: product key, computer name, domain joining, local users and passwords, etc.
The components that need to be added are the following:
Cycle 4: Specialize
Cycle 7: OobeSystem
Here’s an example of the values selected:
- ComputerName: If you set the value to "*", the computer name will be generated randomly (based on the RegisteredOwner value).
- The user name inserted in "Credentials" will be the user that joins the computer to the domain. No elevated privileges are needed for this user, but remember that a normal user can only join 10 computers to the domain.
- ProtectYourPC: A value of 1 establishes that updates will be automatically downloaded and installed.
- LocalAccount: The user added here will be created locally and, in this example, is also a member of the Administrators group.
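As a rough sketch, an AutoUnattend.xml covering those values could look like this. The domain, account names and passwords are placeholders, and the real file should always be created and validated in System Image Manager:

```xml
<?xml version="1.0" encoding="utf-8"?>
<unattend xmlns="urn:schemas-microsoft-com:unattend"
          xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State">
  <settings pass="specialize">
    <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
      <!-- "*" generates a random computer name -->
      <ComputerName>*</ComputerName>
    </component>
    <component name="Microsoft-Windows-UnattendedJoin" processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
      <Identification>
        <JoinDomain>contoso.local</JoinDomain>
        <!-- Account used only to join the domain; a normal user works (10-join limit) -->
        <Credentials>
          <Domain>contoso.local</Domain>
          <Username>joinuser</Username>
          <Password>Passw0rd!</Password>
        </Credentials>
      </Identification>
    </component>
  </settings>
  <settings pass="oobeSystem">
    <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
      <OOBE>
        <HideEULAPage>true</HideEULAPage>
        <!-- 1 = updates downloaded and installed automatically -->
        <ProtectYourPC>1</ProtectYourPC>
      </OOBE>
      <UserAccounts>
        <LocalAccounts>
          <!-- Local user, member of Administrators in this example -->
          <LocalAccount wcm:action="add">
            <Name>localadmin</Name>
            <Group>Administrators</Group>
            <Password>
              <Value>Passw0rd!</Value>
              <PlainText>true</PlainText>
            </Password>
          </LocalAccount>
        </LocalAccounts>
      </UserAccounts>
    </component>
  </settings>
</unattend>
```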
To confirm that the answer file has no errors, click on Tools and Validate Answer File. All the errors will be displayed and explained on the lower panel.
Once the file is validated click on Save and use the name AutoUnattend. You can locate this answer file on any folder.
Note: A good thing about System Image Manager is that you can access a description of each component it offers. If you have any doubt about the values you are entering, take a look at the help file.
Associating Unattended Files
Now that we have both necessary files, all we need to do is associate each file with the images we are going to install:
1. Associating WDSClientUnattend.
a) Open the WDS console on your server.
b) On the servers listed on the snap-in, right click on the server name and select Properties.
c) Open Client and select Enable Unattended Installations.
d) Depending on the architecture for which you created the image, browse for the WDSClientUnattend.xml file.
2. Associating AutoUnattend
a) On the Installation Images, open VistaInstallation.
b) Right click on the image that you uploaded and select Properties.
c) Select the option Allow image to install in unattended mode.
d) Click on Select File and browse for AutoUnattend.xml.
And there you go: you have a complete environment for full, unattended installations of Windows Vista images.
Remember that, at the moment of booting a client machine, once you’ve selected the Windows PE image to boot, the complete installation process will not require any user intervention. If you need different disk options or partitions on particular clients, you can deselect the option where you chose the WDSClientUnattend file; with that you get to manually make any changes to image selection and disk management.
Common Issue on x64 images and WDS
I’ve encountered scenarios where the 64-bit images are not available for selection during a deployment, even though they were correctly uploaded to the WDS server and the client supports the x64 architecture.
The problem is that when the client connects to the WDS server (soon after you press F12 to boot from the network), it doesn’t report that it is an x64-capable client. As far as the server knows, the client is only x86-compatible, so the rest of the images do not apply to it.
To solve it:
1. Open a cmd on the WDS Server
2. Insert WDSUTIL /set-server /architecturediscovery:yes
The next time any client contacts the WDS server, it will first report whether it is x64-capable.
Hope you find this guide useful.
May 27, 2008
I was very excited when I started to play around with the first beta versions of Windows Server 2008 and experiment with the latest security improvements. At first I wanted to start with one of the most basic and important parts of this new server: Active Directory.
Several security improvements related to Active Directory can be found: read-only domain controllers, more group policies, auditing enhancements, etc. After setting up a small lab to check all these features, I finally arrived at another important Active Directory matter: backing up and restoring data on a domain controller.
I was pretty disappointed at first when I realized there was no easy way to back up the system state of a domain controller, and even more disappointed when I couldn’t find a way to schedule a system state backup! Well, in this post I want to review how to simply schedule a system state backup on a domain controller and maintain those backups by removing the old ones from the backup catalog.
a. A secondary hard drive on the domain controller. It cannot be a network drive.
The only possible storage target for backing up your server is a secondary hard drive attached locally.
b. Having the Windows Server Backup feature installed.
The first thing you must know before starting to back up data on Windows Server 2008 is that the backup tool is not installed by default, as it was on Windows Server 2003 with ntbackup. To install it:
a. Open Windows Server Manager snap-in
b. Access Features section and click on Add a New Feature
c. Select Windows Server Backup including the sub-item “Command Line Tools”
i. This will also require PowerShell
d. Click on Install.
Scheduling System State Backup
If you check the GUI of Windows Server Backup, you’ll see that there’s no way to back up the system state from there:
The only way to back up the system state is via the command line. So, to use this backup feature as a scheduled task, we are going to create a .bat file and schedule that batch file to run at the desired time (actually, you can skip creating the .bat file and just use Task Scheduler with the right parameters).
1. Open notepad and insert:
WBADMIN START SYSTEMSTATEBACKUP -backuptarget:e: -quiet
"e:" is the local hard drive where the backup catalog will be stored.
"-quiet" is the parameter used to skip the confirmation prompt.
2. Save it as a batch file. Like: systemstatebackup.bat
3. Open Task Scheduler and create a “New Task”. The task properties window will open.
4. On the “General” tab select:
a. “Run whether the user is logged on or not”
b. “Run with the highest privileges”
5. On the “Triggers” tab, click on “New”:
Here is where you select how often the backup task will run. This is an example of a task running weekly:
6. On “Actions” click on “New” and select to “Start a Program” and browse the batch file you just created.
7. Click on “OK” and the schedule task is ready.
You can manually run this task on demand by right-clicking it and selecting “Run”.
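If you prefer to skip the Task Scheduler GUI entirely, an equivalent weekly task can be created from the command line with schtasks (the task name, script path, day and time below are just examples):

```
schtasks /create /tn "SystemStateBackup" /tr "c:\scripts\systemstatebackup.bat" /sc weekly /d SUN /st 02:00 /ru SYSTEM
```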
This task takes between 40 minutes and 1 hour (or even more) to complete, depending on the system state data (Active Directory, DNS, registry, certificates, etc.).
This is the process running
I have the backup… but what the hell is this??
Your first impression of the backup probably won’t be the best:
- You don’t have permissions to see the backup files at first.
- You don’t see a simple .bkf file as you did when you used ntbackup.
- The size of every backup (that is, every time you run the task) is about as large as the system drive.
After all that, maintaining those backups sounds a little hard to do: the backup hard drive will grow significantly in a few weeks and, for sure, you won’t feel very comfortable if you just try to delete things by hand.
Keeping It Simple… and smaller
But none of these annoying things exist just to make our work harder and more awkward. Besides adding a new layer of security to our backups, they actually make maintaining the old backups simpler.
You can create a new scheduled task that will keep every week (if that’s your case) only the newer backups on your catalog:
WBADMIN DELETE SYSTEMSTATEBACKUP -backuptarget:e: -deleteOldest -quiet
This way you will prevent the backup hard drive from growing enormously. Good to keep in mind if you are working with virtual machines: you probably know it’s REALLY annoying to have a big virtual disk and not be able to shrink it (at least not easily).
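Putting both commands together, the scheduled batch file can create the new backup and prune the oldest one in a single run (a sketch; e: is the backup drive from the example above):

```
@echo off
rem Weekly system state backup followed by catalog cleanup
wbadmin start systemstatebackup -backupTarget:e: -quiet
wbadmin delete systemstatebackup -backupTarget:e: -deleteOldest -quiet
```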
This is the cmd running and deleting an old system state backup (without the -quiet parameter).
Restoring Active Directory from these backups is not very different from backing it up; we’ll cover that procedure in a future post.
Hope it helps!
May 11, 2008
Ok then, after completing the first configurations made in Part I of this guide, we can perform a clean but attended network installation of Windows Vista.
There are two main steps to take and complete a full image and unattended deployment:
1. Creating the base image to deploy: OS, programs and other special configurations + uploading it to the WDS server.
2. Making an unattended file to be used with that image.
Creating the Base Image
Note: In this series of posts we are only considering deploying Windows Vista or Windows Server 2008 images. The unattended files used in WDS Native mode are only valid for those operating systems; if you want to perform unattended deployments of Windows XP or 2003, you will need to use RIS or WDS Legacy mode.
The first step is pretty simple: it consists of installing the operating system with all the features, programs and configurations that you want. But there are some considerations first:
After you complete the image, there’s a process that releases all the data specific to the computer where it’s installed, like the Security Identifier (SID), computer name, etc. Here are some of the things that the image won’t keep after the release process:
· Computer name
· Owner and Company name
· Domain or workgroup membership
· TCP/IP Settings
· Regional and keyboard settings
· Specific hardware drivers. This refers to computer-specific hardware, like video or audio drivers. Drivers included in the Windows installation will still apply for the deployment, but any other externally installed driver will be unavailable.
· Any saved network connections (wireless networks saved)
· OS product key. This is important: no matter whether your product has been activated, the key is reset after this process.
But here are some of the things that are kept after this release process:
· Programs and features installed (pretty obvious at this point, right?)
· Local Users and Groups created.
· Product keys used for installed programs. Meaning, if you have Microsoft Office installed, the key used will remain the same on the deployments.
· Windows updates installed
· User profiles: Since all the profile configuration is basically data stored in the Users folders, all that information will be uploaded within the image.
· Printers installed.
The whole upload process is driven from the client side, but we must first prepare the WDS server to receive images.
First, we are going to add a boot image that will be used specifically to capture operating system images.
1. Go to the WDS console and upload a second boot image; it can be the same boot.wim from a Vista or Windows Server 2008 media that we added in the first post.
2. Instead of naming it Windows PE, use a name like “Image Capture”.
3. After the process completes, right click on the image you just added and select “Create Capture Boot Image”
Now we have set our WDS server, let’s prepare the client using the sysprep tool and upload the image:
1. On the Vista or Windows 2008 client open a “cmd” as administrator and insert “cd c:\windows\system32\sysprep”.
2. Run “sysprep /oobe /generalize /reboot”.
This process takes a few seconds and, after it completes, the OS will reboot automatically.
3. As soon as the machine is rebooting, press F12 to select a different boot device.
4. Select to boot from the network card connected to the LAN
Now the client communicates with the DHCP server to request an IP address and a boot image, and the DHCP server forwards the request to WDS. You will be prompted to press F12 one more time.
5. Since we have two boot images, let’s select “Image Capture”
The boot image will start to load.
6. An image capture wizard will start; click “Next”
7. Now select the volume we want to capture, in our case C:\, and enter a name and description for the image that will be uploaded.
It’s important to note that if the sysprep process did not complete properly, no volume will be available for selection.
8. In the next window you must select where the .wim file will be temporarily stored locally. Choose to keep it in the root of C:\ (this file is not uploaded within the image).
9. Select the option “Load the image to a WDS server”; enter the name of the server and click “Connect”
10. You will be prompted for credentials; use a privileged domain account or the local administrator account of the WDS server.
11. Now select the image group name where you want to store the new image and click on Finish.
Here the image compression and preparation process starts; it could take several minutes (~30 min to ~1 hr) depending on the image size and the hardware involved. After this process, the image is uploaded to the WDS server.
After it completes, check on the WDS console, the image should be uploaded and ready to be deployed.
We still haven’t configured any unattended file, so the image can be deployed, but the entire OS configuration must be entered manually, just like a normal OS installation, although all the programs will already be installed.
For the unattended files preparation and configuration, take a look at the third post of this WDS series.