Shared Points for SharePoint

SharePoint 2010


The SharePoint Conference has just started, and the veil of secrecy has finally been lifted :) It's time to start talking about SharePoint 2010.

Microsoft CEO Steve Ballmer and Corporate Vice President for SharePoint Server Jeff Teper kicked off the SharePoint Conference in Las Vegas yesterday and unveiled a wealth of information about the upcoming product.

Something to look forward to: Office 2010 and SharePoint 2010 will be available as public Beta in November.

Start looking into the new product:

Also see Jeff Teper's blog article from yesterday as part of the SharePoint 2010 disclosure: http://blogs.msdn.com/sharepoint/archive/2009/10/19/sharepoint-2010.aspx

 

 


SharePoint 2010 – Records Management


SharePoint 2010 introduces new Records Management capabilities. The most interesting feature, I think, is In-Place Records Management (info at the end of this post), which makes it possible to declare documents as records without moving them to a Records Center. Records and documents live side by side in the site they were created in, and SharePoint 2010 makes it possible to apply different policies (i.e. retention schedules) depending on whether the item is a record or a document.

The Records Center site template is still available. It utilizes some of the new functionality, like the Content Organizer, to route incoming documents to the libraries of your preference. A new “dashboard” gives the records manager faster access to common tasks and to-do actions.

Holds (with discovery), auditing and file-plan reports are also new or improved functionality that give you better control of your records.

Items other than documents can also be declared as records in SharePoint 2010: wiki pages, blog posts and article pages, among others.

This post gives an overview of two capabilities: the Records Center and In-Place Records Management. Functionality like Holds, file plans and other Records Management features will be covered in later posts.

The functionality described in this blog post may be changed substantially prior to final commercial release of SharePoint 2010.

 

Create a Records Center site

Simply create a Records Center by selecting the correct site template:

image

When creating a Records Center in SharePoint 2010, the following features are enabled:

  • Content Organizer

  • E-mail Integration with Content Organizer

  • Hold and eDiscovery

  • Metadata Navigation and Filtering

  • Offline Synchronization for External Lists

  • SharePoint Server Enterprise Site features

  • SharePoint Server Standard Site features

  • Team Collaboration Lists

 

The new look of the SharePoint 2010 Records Center:

image

The “Submit a Record” button lets users upload (and add metadata to) documents. The documents are added to the “Drop Off Library” and then moved to the correct library/folder according to the rules set up by the records manager.

Note: Activate the “In Place Records Management” feature at the Site Collection level to take full advantage of the Records Center. The feature has to be activated before incoming documents can be marked as records:

image
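If you prefer scripting this step, the feature can also be activated from PowerShell. A minimal sketch, assuming the feature's internal name is "InPlaceRecords" (the site collection URL is a placeholder):

# Activate the In Place Records Management feature on the site collection
# ("InPlaceRecords" is assumed to be the feature's internal name)
Enable-SPFeature -Identity "InPlaceRecords" -Url http://records.contoso.com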

Create / enable Content Types

The Content Organizer uses Content Types (and related metadata) as criteria for where to move incoming documents. Content Types must therefore be created (or enabled) before the routing functionality can be used.

Create libraries for storing Records

I created a document library, “Contracts 2009”, and added a rule named “Contract” to the Content Organizer. When documents are submitted, the Content Organizer moves any documents related to the Content Type “Document” into the “Contracts 2009” library. You can create as many libraries/folders and Content Organizer rules as you need to best control your records.

In my “Contracts 2009” document library, I went to Library Settings and enabled “Automatic Declaration”. Now, all documents added to the library are automatically declared as records:

image

Create Content Organizer rules

I added rules by selecting Site Actions / Site Settings / Site Administration / Content Organizer Rules:

image

What happens if my target library doesn’t have the necessary Content Type enabled? I tried to create a rule using the Content Type “Dublin Core” as a criterion; I wanted all documents related to “Dublin Core” to be moved to the “Contracts 2009” library. As you can see from the message below, target libraries also need the necessary Content Types enabled for the Content Organizer to be able to move documents into them.

image 

Retention Schedule

Retention schedules are, by default, enforced on Content Types, but it is possible to define retention schedules at the library/folder level too.

Retention schedule using Content Types

I went to Site Actions / Site Settings / Galleries / Site content types (at the Site Collection level), then clicked on the content type “Document”.

image

Then I clicked “Information management policy settings”, checked “Enable Retention” and was given the choice of running different retention schedules on non-records and records:

image

I chose to use a different retention schedule, and clicked “Add a retention stage for records…”. I was given the choice of setting different actions when the event fires.

image

It’s possible to add multiple actions, and each stage will occur one after the other in the order they appear on the page:

image 

Retention schedule using Library/folder

Retention schedules can also be configured directly on a document library. I went to my “Contracts 2009” library and selected “Document Library Settings”. Under “Permissions and Management” I clicked “Information management policy settings”. Here, I changed the source of retention by clicking “Change source”.

When choosing a different source, a message pops up that basically says you’re overriding the retention schedule at the Content Type level:

image

Then a form was presented, where I added my retention events and actions.

In-Place Records Management

A new capability in SharePoint 2010 is In-Place Records Management. Instead of moving a document to a specific Records Center, you declare the document as a record, and it will be handled as a record in the site it was created in. After the document is declared as a record, it can have policies and restrictions different from those it had as a document. The policies are added either to the Content Types or directly on the document libraries (see the Retention Schedule section above).

Documents can be declared as records either manually or automatically.

Manual record declaration can be configured at the Site Collection level and overridden in each document library. In Site Collection settings you have the following options for how declaration of records should be done:

image 

When Record Declaration Availability is set to “Available in all locations by default”, a new icon appears on the Ribbon:

image

A document will get a padlock added to its icon when declared as a record:

image

Again, you can override the record declaration availability on the document library level:

image

Automatic declaration of records is possible by checking the “Automatic Declaration” option in the document library settings.
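Records can also be declared from code. Here is a minimal sketch using the server object model from PowerShell; the site URL and list item are placeholders, and it assumes the Records class in Microsoft.Office.Policy.dll, which hosts the declaration API:

# Load the records management assembly on the server
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Policy") | Out-Null

$web = Get-SPWeb http://teams.contoso.com        # placeholder site URL
$item = $web.Lists["Contracts 2009"].Items[0]    # placeholder list and item

# Declare the item as an in-place record, then verify
[Microsoft.Office.RecordsManagement.RecordsRepository.Records]::DeclareItemAsRecord($item)
[Microsoft.Office.RecordsManagement.RecordsRepository.Records]::IsRecord($item)

$web.Dispose()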

Document Management in SharePoint 2010


SharePoint 2010 has extended its Document Management (and Records Management) capabilities considerably. I have looked into the new capabilities and will give you a quick overview of some of them. This post explains the following functionality:

  • Document ID
  • Rating, Tag/Notes
  • Rule Based Submission

The functionality described in this blog post may be changed substantially prior to final commercial release of SharePoint 2010.

Document ID

At the Site Collection level you can enable the use of Document IDs:

image

Each document in the Site Collection will get its own unique Document ID (see the next screenshot). A document is addressable using the unique Document ID via a document redirector page (/DocIdRedir.aspx?ID=uniquedocid), and the Document ID can also be used as a property when searching.
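Enabling the feature can be scripted as well. A minimal sketch, assuming the Document ID feature's internal name is "docid" (the URL is a placeholder); note that IDs for existing documents are assigned by a timer job, so they may not show up immediately:

# Enable the Document ID feature on a site collection
Enable-SPFeature -Identity "docid" -Url http://intranet.contoso.com/sites/docs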

Rating, Tag/Notes

I have enabled both versioning and forced check-out on my document library. Rating and adding Tags/Notes do not require a check-out, so users are able to easily rate and add notes to the document.

Just hover over the stars to add rating:

image

To add Notes, select the document and then click the Tags & Notes button on the Ribbon:

image

Save the notes by clicking “Post”.

Rule Based Submission

SharePoint 2010 lets you define rules on a folder for further routing of documents. Documents uploaded to the folder will be moved to the correct library/folder based on the rules you apply. The rules depend on the metadata available on the document.

The Content Organizer feature must first be enabled on the Site:

image

The feature creates a document library named “Drop off Library”:

image

After enabling the feature, go to Site Administration and click “Content Organizer Rules”. A form pops up, and this is where you add the routing criteria. I made a rule called “Contracts”, and it will move all documents with the Content Type “Dublin Core” to my Contracts document library.

Page 1:

image

Page 2:

image
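The same kind of rule can be created in code. Below is a rough sketch using the server object model from PowerShell. It assumes the EcmDocumentRouterRule class from the records management API; the site URL is a placeholder, and the library name matches the example above:

# Load the records management assembly on the server
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Policy") | Out-Null

$web = Get-SPWeb http://intranet.contoso.com    # placeholder: site with the Content Organizer enabled

# Create a rule that routes "Dublin Core" documents to the Contracts library
$rule = New-Object Microsoft.Office.RecordsManagement.RecordsRepository.EcmDocumentRouterRule($web)
$rule.Name = "Contracts"
$rule.ContentTypeString = "Dublin Core"
$rule.ConditionsString = "<Conditions></Conditions>"    # no extra conditions
$rule.RouteToExternalLocation = $false
$rule.TargetPath = $web.Lists["Contracts"].RootFolder.ServerRelativeUrl
$rule.Priority = "5"
$rule.Enabled = $true
$rule.Update()

$web.Dispose()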

Additional settings are available under Site Administration / Content Organizer Settings:

image

(Note: the screenshot above doesn’t list all settings)

When users add a document to the “Drop Off Library”, the routing rules kick in and move the document to the correct location.

I have uploaded a document into the Drop Off Library and the metadata form is presented. I select Dublin Core and fill in necessary metadata:

image

Then I select “Check In”, and the following pop-up is presented:

image

The routing rule then moves the document into the correct folder.

Other DMS capabilities

I will blog about the following capabilities later:

  • Document Sets
  • In Place Records Management
  • Hold and eDiscovery
  • Library and Folder Based Retention
  • Validate meta data upon submission of documents
  • Column default value settings
  • Office Web Apps
  • Term Store and Managed Metadata

How to publish a Managed Metadata Service for cross-farm consumption


Update: Modified the internal links in this post that didn’t work previously.

[Please keep in mind that this applies to a pre-release version of SharePoint Server 2010, and may change before the product is released]

Some of the service applications in SharePoint Server 2010 support sharing across SharePoint farms, as described in this article from TechNet. This post describes how to do it with a Managed Metadata Service.

To be able to subscribe to content types and terms from a Managed Metadata Service in another farm, there are a few things you need to do. Although this could seem like a somewhat cumbersome procedure for accomplishing something rather simple, it’s necessary for having control of your systems, and keeping them protected from intruders. Besides, you only have to do it once, and it really shouldn’t take you more than 10 minutes when you know what to do, which is exactly what I’m about to explain to you.

Most of the operations can be performed using SharePoint 2010 Central Administration, but I encourage you to use PowerShell instead, as this is a lot faster once you get used to it. There are six steps you need to follow, and I recommend doing them in the following order:

  1. Set up a trust relationship between the farms
  2. Set up Application Discovery and Load Balancer Service Application permissions
  3. Publish Managed Metadata Service
  4. Connect proxy to Managed Metadata Service
  5. Add proxy to service connection group
  6. Set up Managed Metadata Service permissions

In this post, the publisher farm is the farm in which the Managed Metadata Service is running, while the consumer farm is the farm that will consume data from the publisher farm.

Before we begin, let me first briefly describe the environment we’ll be working with:

  • The publisher farm is called “Enterprise Services Farm”, and has a Managed Metadata Service application called “Enterprise Metadata Service”. This service application consists of a term store, and a content type syndication hub.
  • The consumer farm is simply called “Collaboration Farm”.

Although documentation for most of the necessary operations can be found on TechNet, I’m including the actual PowerShell commands to give you an example of how it’s done. Please note that commands with brackets like this <content> require you to replace the brackets and content with what the content describes.

Ok, let’s get started!

Set up a trust relationship between the farms

For the server farms to be able to communicate, you need to set up a trust relationship between them. This enables the farms to know that a service request is actually coming from the farm it claims to be coming from, and also enables federated authentication of users potentially not present in both farms. Farm root certificates must be exchanged between the servers, and an STS certificate must be exported from the consumer and imported to the publisher. How to set up the trust relationship is described in detail on TechNet.

On the publisher farm, run the following commands to export the farm root certificate to c:\temp on the server:

$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content C:\temp\EnterpriseServicesRootCert.cer -Encoding byte

Run the following to export the necessary certificates on the consumer farm to c:\temp on the server:

$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content "C:\temp\CollaborationRootCert.cer" -Encoding byte

$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content "C:\temp\CollaborationSTSCert.cer" -Encoding byte

Copy the files from the c:\temp folder on the publisher farm to the c:\temp folder on the consumer farm and vice versa.

Run the following commands on the publisher farm to set up the trust relationship with the consumer farm:

$trustCert = Get-PfxCertificate "C:\temp\CollaborationRootCert.cer"
New-SPTrustedRootAuthority Collaboration -Certificate $trustCert

$stsCert = Get-PfxCertificate "c:\temp\CollaborationSTSCert.cer"
New-SPTrustedServiceTokenIssuer Collaboration -Certificate $stsCert

Finally, run these commands on the consumer farm to set up the trust relationship with the publisher farm:

$trustCert = Get-PfxCertificate "C:\temp\EnterpriseServicesRootCert.cer"
New-SPTrustedRootAuthority EnterpriseServices -Certificate $trustCert

Set up Application Discovery and Load Balancer Service Application permissions

The Application Discovery and Load Balancer Service Application, aka the Topology Service, handles discovery of the farm’s service applications, providing other farms with the information necessary for them to be able to consume any of the farm’s published service applications (it also serves purposes inside the farm, but that’s not the subject of this post). The only supported right for this service application is “Full Control”, which is what we’ll grant.

On the consumer farm, run the following command to get the id of the consumer farm:

(Get-SPFarm).Id

Copy the Id output from this command, and run the following command on the publisher farm:

$security = Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity

$claimProvider = (Get-SPClaimProvider System).ClaimProvider

$principal = New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimProvider -ClaimValue <farmid from previous command>

Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"

Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security 

Publish Managed Metadata Service

To enable content types and terms to be accessible from outside the farm, the Managed Metadata Service must be published for outside consumption. How to publish the service application is described in detail in this article.

On the publisher farm, run the following command to publish the Managed Metadata Service called “Enterprise Metadata Service”:

Publish-SPServiceApplication (Get-SPMetadataServiceApplication "Enterprise Metadata Service")

Connect proxy to Managed Metadata Service

Next, we need to create a proxy for the “Enterprise Metadata Service” in the consumer farm. On the publisher farm, run the following command to get the URI of the “Application Discovery and Load Balancer Service Application” (which will provide the consumer farm with information about the “Enterprise Metadata Service”):

Get-SPTopologyServiceApplication

Copy the LoadBalancerUrl from the output of the previous command to the consumer farm, and then run the following command to get the URI of the Enterprise Metadata Service application, and create a local proxy for it:

New-SPMetadataServiceApplicationProxy -Name "Enterprise Metadata Service Proxy" -URI (Receive-SPServiceApplicationConnectionInfo -FarmUrl <LoadBalancerUrl from the previous command> | Where {$_.Name -eq "Enterprise Metadata Service"}).Uri

Add proxy to service connection group

Run the following command on the consumer farm to add the new proxy to the default proxy group:

Add-SPServiceApplicationProxyGroupMember (Get-SPServiceApplicationProxyGroup -default) -Member (Get-SPMetadataServiceApplicationProxy "Enterprise Metadata Service Proxy")

The result of this is that all web applications using the default proxy group will use the “Enterprise Metadata Service Proxy” too.

Set up Managed Metadata Service permissions

Finally, to allow the consumer farm to connect to the Managed Metadata Service on the publisher farm, you have to grant the consumer farm permissions to the service application. The Managed Metadata Service supports three permissions: “Read Access to Term Store”, “Read and Restricted Write Access to Term Store” and “Full Access to Term Store”. In this example, we’ll grant the consumer farm the least permission, “Read Access to Term Store”:

On the consumer farm, run the following command to get the id of the farm:

(Get-SPFarm).Id

Copy the outcome to the publisher farm, and then run the following commands there:

$security = Get-SPMetadataServiceApplication "Enterprise Metadata Service" | Get-SPServiceApplicationSecurity

$claimProvider = (Get-SPClaimProvider System).ClaimProvider

$principal = New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimProvider -ClaimValue <farmid from previous command>

Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Read Access to Term Store"

Get-SPMetadataServiceApplication "Enterprise Metadata Service" | Set-SPServiceApplicationSecurity -ObjectSecurity $security

That’s it, you’re up and running with a Managed Metadata Service that is shared between the two server farms!

The only thing left to do is to decide what you want to consume through the proxy, and configure it accordingly. You can learn more about this on TechNet.

Finally, I’d like to share some error messages I’ve received related to the tasks described here, and what was their resolution in my case, to ease your troubleshooting:

  • If you try to manage your Managed Metadata Service Proxy on the consumer farm from SharePoint 2010 Central Administration and get the error message “The Service Application being requested does not have a Connection associated with the Central Administration web application. To access the term management tool use Site Settings from a site configured with the appropriate Connection”, this could be because your proxy has not been added to a proxy group. See Add proxy to service connection group.
  • If you try to manage your Managed Metadata Service Proxy on the consumer farm from SharePoint 2010 Central Administration and get the error message “The Managed Metadata Service or Connection is currently not available. The Application Pool or Managed Metadata Web Service may not have been started. Please Contact your Administrator”, this could be because your farm doesn’t have the right permissions to the Managed Metadata Service. See Set up Managed Metadata Service permissions.

Debugging SharePoint – Web application could not be found Issue


Sometimes when I debug my SharePoint code from Visual Studio, I get the exception:

System.IO.FileNotFoundException was unhandled
  Message=The Web application at
http://jl-sp2010/ could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.
  Source=Microsoft.SharePoint
  StackTrace:
       at Microsoft.SharePoint.SPSite..ctor(SPFarm farm, Uri requestUri, Boolean contextSite, SPUserToken userToken)
       at Microsoft.SharePoint.SPSite..ctor(String requestUrl)
       at SPConsoleApplication_SPRouterRule.Program.Main(String[] args) in C:\Install\SPConsoleApplication_SPInstall.Solution\SPConsoleApplication_SPRouterRule\Program.cs:line 33
       at System.AppDomain._nExecuteAssembly(Assembly assembly, String[] args)
       at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
       at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
       at System.Threading.ThreadHelper.ThreadStart()
  InnerException:


Most often this is due to one of three things:

  • Visual Studio not running as administrator
  • Wrong target framework
  • Wrong platform target

 

Run as administrator

The Visual Studio instance must be started as “Run as administrator”.

image

Target framework: .NET Framework 3.5

Visual Studio creates a new console application targeting .NET Framework 4.0. For SharePoint 2010 you need to set it to .NET Framework 3.5.

image

Platform target: Any CPU

Visual Studio creates a new console application with the platform target set to x86. For SharePoint you need to set it to Any CPU.

image

Script a Metadata Service Application using PowerShell


Scripting the deployment of service applications is a good idea; it is predictable, repeatable and gives you control over database names. I found a good script for the Metadata Service Application at Zach Rosenfield’s blog post SP+PS 2010: PowerShell to Create a Service Application. Read Zach’s blog post for a detailed description of the script.

Here is the script, adjusted with RTM changes:

Set-ExecutionPolicy unrestricted

# -----------------------------------------------------------------------------
# Script for setup managed metadata service
# -----------------------------------------------------------------------------

write-host -ForegroundColor Green "setup managed metadata service...";


$FarmName = "Lab"
$DatabaseServer = "LabDB"
$DatabaseUser = "lab\spfarmservice"
$DatabaseUserPassword = (ConvertTo-SecureString "pass@word1" -AsPlainText -force)
$DatabaseCredentials = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $DatabaseUser, $DatabaseUserPassword 

$ServiceApplicationUser = "lab\SpServiceApp"
$ServiceApplicationUserPassword = (ConvertTo-SecureString "pass@word1" -AsPlainText -force)
$ServiceApplicationCredentials = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $ServiceApplicationUser, $ServiceApplicationUserPassword 

$ServiceApplicationPool = "SecurityTokenServiceApplicationPool"
$MetadataDatabaseName = ("{0}_Metadata" -f $FarmName)
$MetadataServiceName = "Managed Metadata Service"

try{

      #Managed Account
      write-host -ForegroundColor Green "Managed account...";
      $ManagedAccount = Get-SPManagedAccount $ServiceApplicationUser -ErrorAction SilentlyContinue  # returns $null instead of throwing if the account is not registered

      if ($ManagedAccount -eq $NULL) 
      { 
           write-host  -ForegroundColor Green "Create new managed account"
           $ManagedAccount = New-SPManagedAccount -Credential $ServiceApplicationCredentials
      }

     

      #App Pool
      write-host -ForegroundColor Green "App pool...";

      $ApplicationPool = Get-SPServiceApplicationPool  "SharePoint Hosted Services" -ea SilentlyContinue

      if($ApplicationPool -eq $null){ 

            write-host  -ForegroundColor Green "Create new app pool"
            $ApplicationPool = New-SPServiceApplicationPool "SharePoint Hosted Services" -account $ManagedAccount 

            if (-not $?) { throw "Failed to create an application pool" }

      }

     

      #Create a Taxonomy Service Application
      write-host -ForegroundColor Green "Create a Taxonomy Service Application...";

      if((Get-SPServiceApplication |?{$_.TypeName -eq "Managed Metadata Service"})-eq $null){     

            Write-Progress "Creating Taxonomy Service Application" -Status "Please Wait..."
            #get the service instance

            $MetadataServiceInstance = (Get-SPServiceInstance |?{$_.TypeName -eq "Managed Metadata Web Service"})

            if (-not $?) { throw "Failed to find Metadata service instance" }

           

             #Start Service instance
            Write-Progress "Start Service instance" -Status "Please Wait..."

            if($MetadataserviceInstance.Status -eq "Disabled"){ 

                  $MetadataserviceInstance | Start-SPServiceInstance 

                  if (-not $?) { throw "Failed to start Metadata service instance" }

            } 

            #Wait
            write-host -ForegroundColor Yellow "Waiting for Metadata service to provision";

            while(-not ($MetadataServiceInstance.Status -eq "Online")){ #wait for provisioning

                  # write status
                  write-host  -ForegroundColor Yellow -NoNewline $MetadataServiceInstance.Status; sleep 5;
                  write-host  -ForegroundColor Yellow -NoNewline "."; sleep 5;

                  # Update the object
                  $MetadataServiceInstance = (Get-SPServiceInstance |?{$_.TypeName -eq "Managed Metadata Web Service"})
            }

            #Create Service App

            $MetaDataServiceApp  = New-SPMetadataServiceApplication -Name $MetadataServiceName -ApplicationPool $ApplicationPool -DatabaseServer $DatabaseServer -DatabaseName $MetadataDatabaseName
            if (-not $?) { throw "Failed to create Metadata Service Application" }

           

            #create proxy

            $MetaDataServiceAppProxy  = New-SPMetadataServiceApplicationProxy -Name "Metadata Service Application Proxy" -ServiceApplication $MetaDataServiceApp -DefaultProxyGroup

            if (-not $?) { throw "Failed to create Metadata Service Application Proxy" }

      }

}

catch
{
            Write-Output $_ 
}

Installing SharePoint 2010 prerequisites offline


I use my private server for different types of testing; it's really a trial & error environment. Some days back I created some new VMs (I am running a Windows Server 2008 R2 host with Hyper-V enabled) and set up an isolated private network that will run SharePoint Server 2010. While I had no problems setting up a new domain and SQL Server (by the way, there is an excellent TechNet article which describes how to set up a virtual ADDC), I experienced issues with the SharePoint Server 2010 Prerequisites Installer. As usual I started the installer, this fantastic little app that does the necessary boring stuff on your server, making it ready for SharePoint 2010 :). It started off normally but stopped mid-way with a message that it failed to download the MS SQL 2008 Analysis Services ADOMD.NET component. Well, no wonder: running VMs in a private network makes them private, with no access to the physical network, of course. I had to install the prerequisites offline.

The easiest approach will of course always be to install SharePoint 2010 on an internet connected server where the Prerequisites Installer downloads the necessary components. But what if you have a private network as mentioned, or simply want more control over which software and which versions get installed? With a little more manual effort you can install the prerequisites offline.

There is an explanation in this TechNet article, but I have also made a few points you can follow:

1) If you don’t have local copies of the prerequisites already, go get them. From an internet connected computer you can download them from this TechNet article. Then they have to be taken to a server that is reachable from your private network. (Lazy me just packed the files as an ISO image that I can easily mount on any VM.)

2) Now, from the command line, find the PrerequisitesInstaller.exe on the root of your SharePoint 2010 media disk.

3) Run the command PrerequisitesInstaller.exe /?  (which gives you the following message)

image

As you can see, it lists the different command options and, most importantly, the different switches. A switch is a specific prerequisite identifier which represents a configuration/update for your server.

For example, as described in the screenshot above, the SQLNCli switch represents Microsoft SQL Server 2008 Native Client while the ChartControl switch represents the Microsoft Chart Controls for Microsoft .NET Framework 3.5.

Given the desired switch, you then enter the path to the software stored locally or on a network share in the following way, e.g.: PrerequisitesInstaller.exe /SQLNCli:\\<path to file>, which in this case sets the installation path for the MS SQL Server 2008 Native Client.

If you need to install all prerequisites this way, you can enter several switches and their network paths in pairs, separated by a space, like this:

Prerequisitesinstaller.exe /SQLNCli:\\<path to file> /ChartControl:\\<path to file> /PowerShell:\\<path to file>….and so on.

Note that the Prerequisites Installer will only install the prerequisites your OS needs; if a prerequisite is not applicable to your OS, it will be ignored. It will of course also check whether you already have the necessary prerequisite and, if so, skip the installation of the specified prerequisite. Finally, if there is a prerequisite you need but don't specify, the installer will try to download and install it from the internet, which in our scenario will fail because there is no internet connection (it will generate an error similar to the one shown below).

image 

4) In my environment, as shown above, the MS SQL 2008 AS ADOMD.NET installation failed, so in this case I entered the following command with the correct switch and file path:

    PrerequisitesInstaller.exe /ADOMD:\\adds\sp2010prerequisites\SQLSERVER2008_ASADOMD10.msi

   

5) This time the Prerequisites Installer went through just fine and I got this success message:

image

 

Note: It is also possible to install the prerequisites using an arguments file. Please see this TechNet article section for details.

Locking Down SharePoint Designer 2010


If you previously worked with Microsoft Office SharePoint Server 2007, you probably know a lot about SharePoint Designer already. One big complaint, at least from the operational side of MOSS 2007, was that an end user could use SharePoint Designer 2007 to customize a site so far down the road that it eventually broke. There were locking possibilities for SharePoint Designer 2007, but one of the options, for example, was to change the ONET.XML file, which you probably didn't want to do. The SharePoint Designer team blogged about these locking possibilities a while back.

For SharePoint Server 2010, a number of enhancements have improved this area.

All my SharePoint developer environments include SharePoint Designer. It was, and is, a quite powerful tool. When I started to work with the new version of SharePoint Designer, it was really enjoyable to see all the improvements that have been made; a new tool in my eyes, and a lot easier to use. And just to be clear, the 2007 edition is not compatible with SharePoint Server 2010; you have to use the new SharePoint Designer 2010. But like the 2007 edition, SharePoint Designer 2010 can be downloaded for free.

One of the several improvements in SharePoint 2010 is the possibility to lock down SharePoint Designer usage. You can now prevent or set restriction on what users can do with the tool either at Site Collection level or at Web Application level.

 

Setting restrictions at Site Collection level

On the Site Settings page you will find the SharePoint Designer Settings menu (available to Site Collection administrators).

image

Selecting this will give you the following possibilities at the Site Collection level:

image

As shown in the image above, SharePoint Designer is enabled by default but has additional settings available.

Looking at the different settings, the first one is quite straightforward: unselecting it disables all usage of SharePoint Designer except for Site Collection administrators. This setting is enabled by default. The second setting is quite interesting: you can now control whether your users should be allowed to detach pages from the Site Definition, also known as unghosting. This is good; users can now be prevented from creating unghosted pages, which in the long run can be a pain. The third setting enables users to customize master pages and page layouts, which control the default look and feel, and the fourth and last setting controls whether users should see the hidden URL structure, such as the _catalogs folder.

Some quick testing:

If we enable management of the hidden URL structure, it becomes available in SharePoint Designer 2010 as shown below:

image 

Allowing customization of Master Pages and Page Layouts makes the page objects visible:

image

Note: This setting only allows light changes to the pages. If you need to do advanced customizations, you have to enable the setting ‘Enable detaching pages from the Site Definition’ as well.

Last, if we completely disallow all usage of SharePoint Designer and then try to open a site in the collection using SharePoint Designer 2010, users are presented with the following message:

image

Note: Site Collection administrators will still be able to use SharePoint Designer 2010. They are not affected by the settings at this level.

 

Setting restrictions at Web Application level

If you open SharePoint 2010 Central Administration you will find a SharePoint Designer section in the General Application Settings menu.

image

Opening this, you are presented with the following settings:

image

As you can see from the image above, the SharePoint Designer settings are enabled by default at the Web Application level. Adjusting these settings applies the chosen SharePoint Designer 2010 restrictions to the specified Web Application. Even Site Collection administrators are affected by these settings.

Testing this by disabling all the settings in SharePoint Central Administration and then going back to our site, the corresponding settings at the Site Collection level are now disabled, as shown below.

image


Sending files to a record center using the SP2010 webservice Officialfile.asmx


When you send a document to the record center through the web service officialfile.asmx, there is a way to retain, and even add, metadata on the file.

In our case, we wanted to use the same content type on the team sites as in the record center, but with some additional columns in the record center. Since all our content types are pushed out through a Managed Metadata Service, they are easy to maintain and update. But we felt a bit uncomfortable exposing metadata fields only meant for the record center in all our team sites. So how could we add them?

It turned out we could use “RecordsRepositoryProperty” to accomplish this. In the code below, I generate a RecordsRepositoryProperty array from all the fields of the SPListItem we are about to send to the web service, and then I add two more “custom” properties. (You can of course add as many or as few as you would like.)

SPListItem item = workflowProperties.Item;
SPFile file = item.File;
byte[] bytes = file.OpenBinary();
string fileType = file.Item.ContentType.Name;
SPFieldCollection fields = file.Item.Fields;

// I'm adding two additional properties, hence the +2
OfficialFile.RecordsRepositoryProperty[] properties = new OfficialFile.RecordsRepositoryProperty[fields.Count + 2];
GenerateRecordsRepositoryArray(file, fields, properties);

System.Configuration.AppSettingsSection s = GetWebConfigSiteConfigurations();

OfficialFile.RecordsRepository sv = new OfficialFile.RecordsRepository();
string recordCenterUrl = s.Settings["RecordCenterUrl"].Value;
sv.Url = recordCenterUrl + "/_vti_bin/officialfile.asmx";
string UserName = s.Settings["RecordCenterLogInUserName"].Value;
string sDomain = s.Settings["RecordCenterLogInDomain"].Value;
string PassWord = Password; // You should use some encryption here...

CredentialCache credentialCache = new CredentialCache();
NetworkCredential credentials = new NetworkCredential(UserName, PassWord, sDomain);

credentialCache.Add(new Uri(recordCenterUrl), "NTLM", credentials);
//credentialCache.Add(new Uri(recordCenterUrl), "Negotiate", credentials); // For some reason or another, this would not work?
sv.Credentials = credentialCache;

string fullUrl = recordCenterUrl + string.Format("{0}/{1}", item.ParentList.ParentWebUrl, item.File.Url);

SubmitStatus = "Sending";
SubmitStatus = sv.SubmitFile(bytes, properties, item.ContentType.ToString(), fullUrl, file.ModifiedBy.LoginName);
LogUtility.LogVerbose(string.Format("Result code from submitting {0} to the record center is: {1}.", file.Title, SubmitStatus), traceCategory);

 

private void GenerateRecordsRepositoryArray(SPFile file, SPFieldCollection fields, OfficialFile.RecordsRepositoryProperty[] properties)
    {
        // Create a RecordsRepositoryProperty for each metadata field.
        int i = 0;
        foreach (SPField field in fields)
        {
            try
            {
                string value = Convert.ToString(field.GetFieldValue(Convert.ToString(file.Item[field.Title])));
                LogUtility.LogVerbose(string.Format("Name:{0}, Type:{1}, Value:{2}", field.Title, field.TypeAsString, value), traceCategory);
                OfficialFile.RecordsRepositoryProperty property = new OfficialFile.RecordsRepositoryProperty
                {
                    Name = field.Title,
                    Type = field.TypeAsString,
                    Value = value
                };
                properties[i] = property;
            }
            catch (Exception exception)
            {
                LogUtility.LogError(string.Format("Error - Failed to process field {0}", field.Title), exception, traceCategory);
            }
            i++;
        }

        //Add the two custom "fields", in order for the content organizer rules to pick them up.
        OfficialFile.RecordsRepositoryProperty property1 = new OfficialFile.RecordsRepositoryProperty
        {
            Name = FolderIDField,
            Type = "Text",
            Value = FolderID
        };
        properties[i] = property1;
        OfficialFile.RecordsRepositoryProperty property2 = new OfficialFile.RecordsRepositoryProperty
        {
            Name = SubFolderTitleField,
            Type = "Text",
            Value = SubSiteTitle
        };
        i++;
        properties[i] = property2;
    }

 

These properties turn up in the record center’s drop off library in a column called “original properties”. So how can this be useful? It is really quite simple: if you have a column (on the record center’s content type) whose name is the same as the RecordsRepositoryProperty’s name you are sending over, SharePoint will match them for you!

So in our case, we have changed the content types in the record center only, by adding two more columns. We send RecordsRepositoryProperties over and the names are matched. Then we can even use Content Organizer rules to move the items along to the records library!

Help! - the business requires me to install third party addons into SharePoint 2010


There are a ton of third party addons to SharePoint, ranging from useful to just fun, from open source communities or commercial providers. The business value of an addon can be very good, but you need to make sure there are no hidden costs. What if the addon introduces:

  • Performance and/or stability issues?
  • Increased operations cost?
  • Elements that put your solution in an unsupported state?
  • Increased upgrade cost?

One of the key questions you need to ask yourself is "What do I need to do to be able to support this?"

How your company handles third party addons and code must be described in your governance plan. With the governance plan you have a tool for mitigating the risks. If you don't care about the possible issues, then just write that in the governance plan (not a good idea, though). The governance plan is unique for each company. Keep it short and follow up on what you decide. Most of the value is in the work itself of creating the governance plan.

 

What to look for and require?

Here are some guidelines and recommendations to help set up a plan for handling 3rd party addons in SharePoint:

  • In general, do not customize existing SharePoint files in the 14 hive. Except for a few well-defined files such as web.config or docicon.xml, you should never modify the built-in files included with SharePoint Products and Technologies except through official supported software updates, service packs, or product upgrades.
  • Artifacts should be packaged in a SharePoint solution (WSP file)
  • web.config changes should be managed using the SPWebConfigModification class to provide consistent deployment across Web Front Ends (where possible! There are configuration entries where it is not possible to use this class)
  • Demand reasons from the developer for putting assemblies in the GAC (there might be good reasons, such as event handler code or other code that needs to go in the GAC or be reachable across WFEs)
  • Depending on company policy you could require that code is decorated with CAS attributes to allow the lowest possible CAS settings (don't accept code that demands Full Trust; then it might as well be in the GAC!)
  • Make sure WSP files are documented - just a few lines telling what the solution contains and what it does (makes it easier when you need to debug what caused your farm to become unstable)
  • Make sure dependencies between WSP files are documented
  • Require that upgrade scenarios have been thought into the features you are about to install (often this will include callout code that handles feature deactivation)
  • Create a baseline performance indication of your farm and update it each time you deploy new solutions and features, to monitor performance degradation
  • Use batch files or PowerShell scripts to deploy your packages to make deployments consistent across environments (dev-test, integration-test, pre-prod, prod) - see the sketch after this list
  • Demand that your code as a bare minimum is unit tested (also consider functional testing, load testing...)
  • Developers should run SPDisposeCheck.exe as part of their release builds
  • Don't accept debug mode build assemblies!
  • Suggest that developers do code reviews before releasing code
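To illustrate the deployment bullet above, here is a minimal sketch of a scripted WSP deployment using the standard SharePoint 2010 cmdlets (file path, solution name and URL are placeholders):

# Add the package to the farm's solution store
Add-SPSolution -LiteralPath "C:\deploy\ThirdParty.Addon.wsp"

# Deploy it to a specific web application; only use -GACDeployment
# if the package actually contains GAC-targeted assemblies
Install-SPSolution -Identity "ThirdParty.Addon.wsp" -WebApplication "http://intranet" -GACDeployment

Running the same script in every environment (dev-test, integration-test, pre-prod, prod) is what makes the deployments consistent.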

 

Separate application farm?

In some cases the risk can be mitigated by running the third party addon in a separate farm (an application farm). That way it will not negatively affect the other SharePoint solutions. An application farm has a cost, doesn't suit all 3rd party addons, and complicates worldwide/global installations.

Information harvested from these resources and experiences:

Connecting custom service application proxies using Powershell


In my current project we have scripted a NewsGator install in a multi-server farm. One of the challenges we faced is that we needed to connect a proxy for the NewsGator Social Platform Services service application. This is very straightforward for the out-of-the-box service applications in SharePoint, as each has a specific PowerShell cmdlet:

New-SPSubscriptionSettingsServiceApplicationProxy
New-SPBusinessDataCatalogServiceApplicationProxy
New-SPMetadataServiceApplicationProxy
New-SPSecureStoreServiceApplicationProxy
New-SPStateServiceApplicationProxy
New-SPEnterpriseSearchServiceApplicationProxy
New-SPPerformancePointServiceApplicationProxy
New-SPProfileServiceApplicationProxy
New-SPVisioServiceApplicationProxy
New-SPWebAnalyticsServiceApplicationProxy

For custom service applications it’s a little bit more complicated. By the way, in most cases you need to set up a trust relationship between the farms before trying to connect the proxy. This also applies to the out-of-the-box proxies. The next step is to get the connection URI for the service application. You can use the topology web service for that (the service application name is a placeholder):

$connectionUri = (Receive-SPServiceApplicationConnectionInfo `
    -FarmUrl http://publisher:32844/Topology/topology.svc `
    | Where { $_.Name -eq "<name of your service application>" }).Uri

Next, loop through all the service proxies to find the right proxy type; $ProxyTypeName below holds the full type name of the proxy you are looking for. Notice that we need to look at each service proxy through the eyes of the IServiceProxyAdministration interface to get access to the GetProxyTypes method.

$ProxyTypeName = "<full name of the proxy type>"

foreach ($sp in (Get-SPFarm).ServiceProxies)
{
    $spa = $sp -as `
        [Microsoft.SharePoint.Administration.IServiceProxyAdministration]
    $types = $spa.GetProxyTypes()
    $type = $types | ? { $_.FullName -eq $ProxyTypeName }
    if ($type -ne $null) { break }
}

Finally, create and provision the proxy:

[Microsoft.SharePoint.Administration.SPServiceProvisioningContext] `
    $provisioningContext = $null
$serviceApplicationProxy = $spa.CreateProxy( `
    $type, "NG Proxy", $connectionUri, $provisioningContext)
$serviceApplicationProxy.Provision()

(Using null for the provisioning context parameter seems to work fine, so I didn’t spend more time trying to figure out what that value should be.)

You can now verify that the service proxy is available in Central Administration | Manage service applications.

NOTICE! To make this post more readable all error handling has been removed from the scripts.

Configure redirect via IIS using command line script


In our current project, we were moving an old SharePoint installation to a new farm. As part of this migration, we got a new, nicer DNS name. We wanted to support old users who had bookmarks etc. by doing a redirect. First of all, we configured both DNS names to point to the same IP address, like this:

  • long-old-url.company.com –> 10.10.10.10
  • nice-url.company.com –> 10.10.10.10

This way, both new and old users could access the same backend transparently. But we wanted the old users to be redirected to the new URL, so that going forward everyone would see “http://nice-url.company.com” in their browser address bar. Fortunately, version 7 of IIS added support for redirects. But it turns out that the redirect module is not enabled by default; you must go into “Windows features” to enable it.

Once the redirect feature was enabled, it was easy enough to configure the redirect using the UI.

 

UPDATE:
I thought it was this easy, but it turns out that query parameters are not passed along when using the straightforward redirect function; you need to add $V$Q and check "Exact destination", like this:

 

As we have multiple environments where this needs to be configured, we wanted to do it all via the command line. It took some research to figure out the steps to activate the Windows feature, and even more to do the IIS configuration via the command line. So, for your convenience, here is a complete batch script which takes four parameters as input and sets it all up in one go. The four parameters are:

  1. Old URL
  2. New URL
  3. Old port (defaults to 80)
  4. New port (defaults to 80)

And here is the complete script with inline comments to explain the required steps:

@echo off

REM Configuration
set oldUrl=%1
set newUrl=%2
if "%3" == "" (set oldPort=80) else (set oldPort=%3)
if "%4" == "" (set newPort=80) else (set newPort=%4)

set siteId=1001
set siteName=%oldUrl%
set virtualDir=c:\inetpub\wwwroot\wss\VirtualDirectories\%oldUrl%-%oldPort%

REM ************************************************
REM Enable HTTP Redirect feature
REM ************************************************
dism /online /Enable-Feature /FeatureName:IIS-HttpRedirect

REM ************************************************
REM Create virtual directory for site
REM ************************************************
mkdir "%virtualDir%"

REM ************************************************
REM Create site
REM ************************************************
%windir%\system32\inetsrv\AppCmd ADD SITE /name:%siteName% /id:%siteId% /bindings:http/*:%oldPort%:%oldUrl% /physicalPath:"%virtualDir%"

REM ************************************************
REM Add redirect setting to web.config
REM ************************************************
%windir%\system32\inetsrv\AppCmd SET config %siteName% /section:httpRedirect /enabled:true /destination:http://%newUrl%:%newPort%$V$Q /exactDestination:true

REM ************************************************
REM Grant read access to Everyone recursively
REM (Automatic when done via UI, but not via cmd)
REM ************************************************
icacls "%virtualDir%" /t /q /grant:r everyone:(GR)

ConfigureIISRedirect.cmd.txt

Troubleshooting SharePoint logging


Both the Event Log and the ULS log are invaluable sources of information when monitoring or debugging your SharePoint solution. Sometimes logging doesn't work. In my experience these are the most frequent reasons:

  1. The log disk is full.
  2. The application pool account of your web application is not a member of the "Performance Log Users" group on ALL servers in the farm.
  3. To be able to create custom event logs from an event receiver, the timer service account needs write permissions to the following registry key:

    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\EventLog
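Granting that permission can be scripted too. A minimal sketch, with a placeholder account name:

# Grant the timer service account write access to the EventLog key
$key = "HKLM:\SYSTEM\CurrentControlSet\Services\EventLog"
$acl = Get-Acl $key
$rule = New-Object System.Security.AccessControl.RegistryAccessRule( `
    "DOMAIN\sptimer", "CreateSubKey,SetValue,ReadKey", "ContainerInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl $key $acl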

If you are just starting to implement logging in your application, a few tips may help you on the way:

How to speed up SharePoint Management Shell startup


On virtual machines without internet access it may take 1-2 minutes to start the SharePoint Management Shell. The problem is that when loading signed assemblies, the .NET Framework checks the internet-based certificate revocation list. The timeout for each check is 30 seconds. To avoid this check, set the following registry key:

[HKEY_USERS\<userid>\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing]
"State"=dword:00023e00

Incorrect syntax near the keyword ‘AS’


The error "Incorrect syntax near the keyword 'AS'" can occur when you try to run SQL scripts that require SQL Server 2008 on an earlier version of SQL Server. In my case it occurred while installing NewsGator Social Sites. SharePoint 2010 supports SQL Server 2005 (in some scenarios), but NewsGator requires SQL Server 2008 SP1.


Removing an invalid WebConfigModification using Powershell


SharePoint supports adding web.config modifications programmatically. Here is an example from MSDN. Each modification is stored with a web application, and you can list them using the following commands:

$webapp = Get-SPWebApplication http://<your web app url>
$webapp.WebConfigModifications

If you want to remove an invalid entry, use the following commands:

$modification = $webapp.WebConfigModifications | where { $_.Name -eq "<the name of your key>" }
$webapp.WebConfigModifications.Remove($modification)
$webapp.Update()
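For completeness, here is a sketch of how such a modification is typically added in the first place, mirroring the MSDN pattern (the path, name and value are placeholders):

$webapp = Get-SPWebApplication http://<your web app url>

$mod = New-Object Microsoft.SharePoint.Administration.SPWebConfigModification
$mod.Path = "configuration/appSettings"
$mod.Name = "add[@key='MySetting']"
$mod.Value = "<add key='MySetting' value='MyValue' />"
$mod.Owner = "MyFeature"
$mod.Sequence = 0
$mod.Type = [Microsoft.SharePoint.Administration.SPWebConfigModification+SPWebConfigModificationType]::EnsureChildNode

$webapp.WebConfigModifications.Add($mod)
$webapp.Update()

# Push the change out to the web.config files on all servers
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplyWebConfigModifications()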

How is the link to people’s My Site from Outlook provided?


In Lync and Outlook you can hover over a contact to get a popup showing ways to interact with the person. This is what it looks like in our environment:

image

When hovering over the person’s name, you should get the option to view the person’s profile page, or “My Site”. But in a standard installation, this connection is not enabled, and you will only be able to highlight the name, e.g. as shown here:

image

In an installation with a proper My Site link, it will look like this when I hover over “Thomas Svensen” in the popup:

Link to My site

And clicking the name will open the person’s My Site in your default browser. Worth noting here is that there is no direct link to the user’s My Site; instead, the My Site host you are connected to is queried using the email address of the person in question. This can be verified by using Fiddler to show a trace of the network calls made.

This link to the user’s MySite is stored in the Registry:

[HKEY_CURRENT_USER\Software\Microsoft\Office\14.0\Common\Server Links\Published\My Site\Profile Site]
"LinkType"=hex(b):00,00,00,00,00,00,00,01
"IsMember"=dword:00000000
"IsPublished"=dword:00000001
"Url"="http://my:80/Person.aspx?user="

I have highlighted “My Site” in the registry path because this name can be anything. Testing has shown that the first entry listed under “Published” is used, whatever the name is. With SharePoint 2010, the only way we have found to create this link in the registry is to:

  1. Open some document library
  2. Click “Library tools > Library” in the ribbon
  3. Click “Connect to Office > Add to SharePoint sites”
  4. Click “OK” in the dialog that appears.

image

This procedure is not really something we can instruct all our thousands of users to follow, of course. There are some suggestions floating around on the net saying that the PortalConnect popup from earlier SharePoint versions can be enabled for SP2010, e.g. as described by Stef van Hooijdonk. But applying those registry updates didn’t change anything for us; neither does it seem to work for others on the SharePoint TechNet forum. The obvious alternative is pushing out the registry update shown above via a GPO (Group Policy Object), but there is no official guidance on this, e.g. in the Office 2010 Group Policy and OCT Settings reference spreadsheet. We are trying to get some guidance, but currently, this is the state of affairs.
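Until such guidance appears, a login script could write the same values. Here is a hedged PowerShell sketch of the registry update; it assumes the hex(b) value above is the QWORD 0x0100000000000000, and the URL should of course point to your own My Site host:

# Create the "Profile Site" server link for the current user
$key = "HKCU:\Software\Microsoft\Office\14.0\Common\Server Links\Published\My Site\Profile Site"
New-Item -Path $key -Force | Out-Null
Set-ItemProperty $key -Name "LinkType" -Value ([int64]0x0100000000000000) -Type QWord
Set-ItemProperty $key -Name "IsMember" -Value 0 -Type DWord
Set-ItemProperty $key -Name "IsPublished" -Value 1 -Type DWord
Set-ItemProperty $key -Name "Url" -Value "http://my:80/Person.aspx?user="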

Deploying search synonyms to SharePoint Search


SharePoint 2010 Search enables you to add search synonyms via XML configuration files on the server. The configuration is well described on TechNet and seems pretty straightforward. There are sample files in the installation that you can use as a starting point, but straight away we saw some challenges related to governance and deployment:

  • Manual editing of XML files only works for really small files
  • Deployment is cumbersome - it must be done on each server, and the path includes GUIDs

We therefore decided to script the rollout, taking a CSV file as input. CSV is easily edited using Excel, and PowerShell is great at parsing CSV and writing XML. This way, both problems could be easily solved. But, as we started trying this out, we ran into additional challenges:

  • Adding anything except letters, numbers and spaces seems to break the feature
  • In some environments, the reference to the XML schema causes problems
  • Duplicate synonym entries make the feature fail silently (this caused us a lot of headache…)
  • We needed to log onto each individual search server to restart it

The XML schema problem appears in the event viewer and looks like this:

clip_image002

We found that removing the XML namespace reference at the start of the XML file solved the problem.

To make a long story short, we have created a script which handles everything you need:

  • Converts CSV to XML
  • Replaces, &, :, ;, etc. with spaces
  • Barfs at you if you have duplicates
  • Looks up the correct guid-folders for your servers
  • Distributes the XML files
  • Does an automatic restart of the OSearch14 windows service on all servers listed

Configuration is done via a simple PowerShell file which is passed as input to the main script. A couple of things to note regarding configuration:

  • An arbitrary number of columns can be added - just make sure to add corresponding dummy-headings. Headings are required for PowerShell to parse the CSV properly.
  • Deploying to "tsneu" is probably sufficient, as is done in the supplied config, but you can add a list of languages if needed.
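To give an idea of the core of the script, here is a simplified sketch of the CSV-to-thesaurus-XML conversion. File names are placeholders, and the real script also handles the character replacement, duplicate detection and distribution described above:

# Read the CSV (the first row must contain headings) and build the thesaurus XML.
# The XML namespace reference is deliberately left out, as described above.
$rows = Import-Csv "synonyms.csv"
$xml = '<XML ID="Microsoft Search Thesaurus">' + "`r`n<thesaurus>`r`n"
foreach ($row in $rows) {
    $xml += "<expansion>`r`n"
    foreach ($prop in $row.PSObject.Properties) {
        if ($prop.Value) { $xml += "<sub>$($prop.Value)</sub>`r`n" }
    }
    $xml += "</expansion>`r`n"
}
$xml += "</thesaurus>`r`n</XML>"
Set-Content -Path "tsneu.xml" -Value $xml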

Download the attached zip for the complete source. Thanks to my colleague Stig Lytjohan for doing much of the hard work! :)

Enjoy!

ConfigureSearchSynonyms.zip

Lost authentication cookies in SharePoint


Are your users complaining about losing their sessions in your SharePoint environment? Or do they complain about being thrown out after a while? This might be the explanation.

When the user has entered his credentials, a cookie is returned to the client (when using claims authentication or other cookie-based authentication methods). This cookie is included in the header of all requests as long as the user is logged in.

Internet Explorer stores up to 50 cookies per domain. This means that if your code generates a lot of cookies, your authentication cookie might sooner or later be lost (as cookies are handled on a First-In-First-Out basis).

SharePoint list view pages generate cookies (normally 2). After browsing a number of list view pages, the browser will have received a high number of cookies. When the number reaches 50, the authentication cookie will be replaced in IE. What the user experiences is being "thrown out" (redirected to the log-in page).

If we log on through Firefox, we can easily see all the cookies generated, using Firebug:


This picture shows the "View all site content" page. There are currently 6 cookies from the site domain. After browsing all the list view pages linked from this page, we get the following cookies:

As you can see in the picture, the stsSyncIconPath, stsSyncAppName, databaseBtnText and databaseBtnDesc cookies (depending on list type) are written to list-relative paths. After browsing a number of pages like these, the number of cookies will sooner or later reach 50. If the user is browsing through IE, the authentication cookie (FedAuth in this example) will be lost.
Note: if you debug the same behavior through Fiddler you will not be able to see these cookies, as they are generated client side (through JavaScript).

If you do experience this issue you could consider a couple of workarounds:

- Wait for a future fix (the behavior will hopefully be changed in one of the next SharePoint CUs)

- Avoid exposure of list views unless you really need them

- Take a look at how the cookies are generated (hint: ows.js/init.js). Remember that modifying the out of the box files is not supported by Microsoft

How to copy a file via remote desktop connection


In some cases, you may need to copy a file from your local computer to a server, or vice versa, via a remote desktop connection (RDP). In most cases, you can easily do this by exposing your local disks directly to the remote machine. But in some cases this is not possible. For example, where I work now, I have to RDP to one machine, and then from there to a third machine. Copying a log file from that external machine to my local PC for further investigation is then a challenge.

Fortunately, a colleague of mine, Denis Heliszkowski, had a cunning solution. Starting in your source environment, follow these steps:

  1. Copy (Ctrl-C) the file from the Explorer window
  2. Open WordPad and paste the document in there, it will then look something like this:
    image
  3. Now, copy this file again, but now from WordPad.
  4. Switch to your target environment (typically your own PC)
  5. Open WordPad and paste into the empty document
  6. Again, copy this file from WordPad
  7. Open Explorer and paste

Voila – your file has been transferred to your local disk!

This of course also works in the other direction, which can be useful if you need to copy something like the Sysinternal tools to a server which does not have connectivity to the internet.

I must admit that I cannot explain in detail why or how this works. According to Denis, it's about the file object being converted to a COM object when inserted into a document. I guess the most important thing is that it just works. :)
