Full Farm Backup fails – Causes and Solutions

Requirements to check before moving on to the Errors section:

  1. A shared folder needs to be created, and Full Control must be granted to the timer service, SQL Server service, and Central Administration application pool accounts
  2. The SQL Server VSS Writer service, which facilitates the backup/restore, should be running
  3. The SharePoint Administration service should be running on all servers in the farm
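
A quick way to verify items 2 and 3 on a given server is a small console check. This is a minimal sketch; the service names used are the usual defaults (SQLWriter for the SQL Server VSS Writer, SPAdminV4 for SharePoint Administration, SPTimerV4 for the timer service), so adjust them if your installation differs.

using System;
using System.ServiceProcess; // add a reference to System.ServiceProcess.dll

class BackupPrereqCheck
{
    static void Main()
    {
        foreach (string name in new[] { "SQLWriter", "SPAdminV4", "SPTimerV4" })
        {
            try
            {
                using (var sc = new ServiceController(name))
                {
                    // Prints Running / Stopped / StartPending, etc.
                    Console.WriteLine("{0}: {1}", name, sc.Status);
                }
            }
            catch (InvalidOperationException)
            {
                Console.WriteLine("{0}: not installed on this server", name);
            }
        }
    }
}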

Error 1: Here WSS_XXXX is any content database.

Object WSS_XXXX failed in event OnBackup. For more information, see the spbackup.log or sprestore.log file located in the backup directory.
SqlException: Cannot open backup device ‘\\Backup\spbr0000\0000015d.bak’. Operating system error 3(The system cannot find the path specified.).
BACKUP DATABASE is terminating abnormally.

Possible Reason: The path given for the backup is wrong.
Solution: Check the path and correct it. It must be a shared folder.

Possible Reason: Insufficient privileges. The Windows SharePoint Services Timer V4 (SPTimerV4) and the SQL Server service account perform backup/restore operations on behalf of the requesting user, so these accounts need Full Control on the backup location.
Solution: While sharing the folder, granting 'Full Control' to 'Everyone' will not work. The domain accounts (ideally) under which the Timer service and the SQL Server service run should be explicitly granted 'Full Control' on the shared folder.

Find the SQL Server accounts that carry out backup/restore by checking the services below:

[Image: SQLService.png – SQL Server services with their Log On As accounts]

Sharing the folder:

[Image: Folder Sharing – granting Full Control on the shared folder]
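
To rule out both a wrong path and missing privileges in one go, a small probe like the sketch below can be run while logged on as (or impersonating) each service account. The UNC path here is a placeholder; substitute your actual backup share.

using System;
using System.IO;

class BackupShareProbe
{
    static void Main()
    {
        // Placeholder - substitute the UNC path of your backup share
        string backupShare = @"\\BackupServer\SPBackups";
        string probeFile = Path.Combine(backupShare, "permission_probe.tmp");
        try
        {
            // Succeeds only if the current account can create and delete files
            File.WriteAllText(probeFile, "probe");
            File.Delete(probeFile);
            Console.WriteLine("Write access confirmed on " + backupShare);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Share is not writable: " + ex.Message);
        }
    }
}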

Error 2: The User Profile Synchronization service instance causes this issue.

FatalError: Object UPS failed in event OnBackup. For more information, see the spbackup.log or sprestore.log file located in the backup directory.
SPDuplicateObjectException: An object of the type Microsoft.Office.Server.Administration.ProfileSynchronizationUnprovisionJob named “ProfileSynchronizationUnprovisionJob” already exists under the parent Microsoft.SharePoint.Administration.SPTimerService named “SPTimerV4”. Rename your object or delete the existing object.

Possible Reason: The 'User Profile Synchronization service' instance (CA -> Manage Services on Server) is in the 'Stopping' state.
Solution: Make sure the service instance is in the 'Started' or 'Stopped' state, not in a transitional '-ing' (Starting/Stopping) state.

Note: Usually this error is thrown when we try to unprovision a service instance that is already in the 'Stopping' state. Here the timer job may not be trying to unprovision the instance; however, it checks the job queue to ensure that no jobs are executing, so that the system gets backed up in a stable state.

Error 3: The Search Service Application causes the issue

FatalError: Object Search Service Application failed in event OnBackup …

Possible Reason: The timer job service account doesn't have permission on the Search Service Application.
Solution: Grant Full Control on the Search Service Application to the timer job service account.

Possible Reason: The Search service account doesn't have permission on the shared folder.
Solution: Share the folder with the Search service account, granting Full Control.

Note: Try narrowing down the permissions as much as possible. It’s not advisable to grant Full Control unless it’s really needed.

SharePoint O365!! Why do I see all my site contents updated at once, a few hours ago?

Most of the sites I've been working with have not been updated in the recent past. However, one fine day I opened the site contents and all the lists and libraries showed they had been modified a few hours earlier. At the time, I didn't bother much to figure out the reason behind those 'all of a sudden & all at once' updates. Anyhow, it is time now to see what's going on. I then found a SharePoint UserVoice post shouting at Microsoft about the same issue (reflect the content changes, not the system changes). This article presents some findings about these 'unexpected modified dates', one of the factors being the REST query. Please find the screenshot below.

[Image: LastModified_Doclib – lists and libraries all showing the same last-modified time]

Recently, when I retrieved a list item, three different date attributes came along, as highlighted below.
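
The request itself was a plain REST GET to a URL along these lines (the site URL and list GUID match the response below; <ID> is the item id):

    https://organization.sharepoint.com/sites/somesite/_api/web/lists(guid'e743b81c-0245-4970-a87d-9692d42813cc')/items(<ID>)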

<entry xml:base="https://organization.sharepoint.com/sites/somesite/_api/"...>
    <id>e11700c1-97d3-4451-8544-0bd6d1e4c995</id>
    <category term="SP.Data.TestFormListItem" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />
    <link rel="edit" href="Web/Lists(guid'e743b81c-0245-4970-a87d-9692d42813cc')/Items(<ID>)" />
    <title />
    <updated>2016-07-20T19:58:50Z</updated>
    <author>
        <name />
    </author>
    <content type="application/xml">
        <m:properties>
            <d:Modified m:type="Edm.DateTime">2016-06-20T12:35:20Z</d:Modified>
            <d:Created m:type="Edm.DateTime">2016-05-31T11:33:56Z</d:Created>
        </m:properties>
    </content>
 </entry>
  • Created : attribute of the content; tells when the item/document was created
  • Modified : attribute of the content; tells when the item/document's metadata was last modified
  • Updated : attribute of the list item entry; tells when the list item was last updated

Now, More Insight…

Before the REST Query

As we can see in the image below, the given lists were all updated at the same time.

[Image: BeforeRestQueried]

Then I thought of observing which REST query triggers the mysterious update mechanism. Then this happened…

[Image: AfterAllListsQuery]

And then this…

[Image: AfterQueryModifiedUpdated]

At this point, after observing it multiple times, I came to the understanding that the first time items are retrieved by a REST query, the date value gets updated. Since all the above requests are GET requests, I tried to access the same resources from the site contents page. However, accessing them that way didn't update the modified date.

I'm really hesitant to make any further comments on this behaviour, or on Microsoft's thinking about how this 'Updated' is more useful than 'Modified'. However, whatever Microsoft's intention, presenting it to the user as the last modified date is confusing/misleading. And it raises questions about the integrity of the product.

Interestingly, this behaviour occurs only with REST queries. I tried the things below as well:

  • Click on link title and open an item
  • Managed client object model code equivalent to “/items” (a minimal sketch follows)
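
Here is a minimal sketch of that CSOM retrieval; the site URL and list title are placeholders.

using Microsoft.SharePoint.Client;

// Rough CSOM equivalent of the REST ".../items" call
using (var context = new ClientContext("https://organization.sharepoint.com/sites/somesite"))
{
    // For O365, also assign context.Credentials (e.g. a SharePointOnlineCredentials object)
    List list = context.Web.Lists.GetByTitle("TestForm"); // placeholder list title
    ListItemCollection items = list.GetItems(CamlQuery.CreateAllItemsQuery());
    context.Load(items);
    context.ExecuteQuery();
    // Unlike the first-time REST call, this retrieval did not bump the list's modified date
}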

However, the “updated” property of the list object didn't get updated.

Final words…

The modified date for lists that we see in the site contents gets updated when a user accesses the “items” via a REST query (either from code or by hitting the URL directly in the browser's address bar) for the first time. So is it for analysing the REST activity of unique users on the list? Nay, that doesn't make sense.

However, it's not just about the REST calls – they are only one of the causes. There must be multiple reasons behind this behaviour.

Riddles here:

  1. What is the use of the “Updated” property?
  2. Somehow, I figured out one cause of the “Modified” date change for lists. The question is: what causes the change in the “Modified” dates of libraries and other objects?

Download a file from SharePoint using CSOM

Looks like a silly question. But it took me two hours to solve. I faced an issue while downloading a file… not from the browser, though.
My requirement was to read the data from an Excel file stored in a document library. BTW, I was using the OpenXML libraries to parse the Excel data, and they require either the path of the file or a stream as a parameter. Since it is not possible to provide the path of a document stored in a library, I had to pass the stream as the argument.
Exactly here I encountered an issue when I tried to read the file as a stream.
The name of the issue is… “NetworkStream”, one of the least discussed classes.
A brief history of “NetworkStream”: it inherits from System.IO.Stream, and with it the methods Stream defines. However, it denies any seek operation we ask for. In fact, this was the first time I had come across this class.

// The statement below returns a NetworkStream object
// fileInfoObject is of type Microsoft.SharePoint.Client.FileInformation
// (e.g. returned by Microsoft.SharePoint.Client.File.OpenBinaryDirect)
var stream = fileInfoObject.Stream;

As per the documentation, it should return a standard stream. But it returns a stream object that doesn't support any seek-related operations (Length, Seek, etc.). See the NetworkStream documentation for the details.
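
A quick probe makes the limitation visible. This is a minimal sketch, reusing the fileInfoObject from the snippet above.

var stream = fileInfoObject.Stream;
Console.WriteLine(stream.CanRead); // True  - forward-only reads work
Console.WriteLine(stream.CanSeek); // False - repositioning is not supported
// Each of the following throws System.NotSupportedException:
// long length = stream.Length;
// stream.Seek(0, System.IO.SeekOrigin.Begin);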

In fact, I tried a couple of methods I found over the internet and found them of no use.
As a result of those experiments, I was left with a corrupted stream object saved to disk as a file, which scolded me very badly, in machine language, when I tried to open it. Further searching rewarded me with a way to deal with the file – the System.Net.WebClient class, which provides methods for sending data to and, in my case, receiving data from a resource identified by a URI.

So, it is an easy operation, as shown below, when we want to get the file as a stream from a site that readily authenticates us with the identity used to run the code.
Here, for authentication purposes, we can assign either a NetworkCredential object or CredentialCache.DefaultCredentials to WebClient.Credentials.

using (WebClient webClient = new System.Net.WebClient())
{
    // Authenticate with the identity the code is running under
    webClient.Credentials = CredentialCache.DefaultCredentials;

    // excelFileUrl - url of the excel file that was uploaded to a document library
    // DownloadData returns a byte array; wrap it in a MemoryStream
    byte[] content = webClient.DownloadData(excelFileUrl);
    Stream streamObject = new MemoryStream(content);

    // now I can do anything with this stream - such as...
    // 1. Saving it to disk
    // 2. Passing it as an argument to an OpenXML library method to parse it
}

It would have been the end of this post if my struggle had ended here. However, my actual requirement was to read the data from an Office 365 SharePoint site, and here the problem is authentication. Since our request doesn't carry any authentication tokens/cookies, the server rejects it, saying we don't have access. Here I tried to use my brain, but it hardly paid off. The method I tried:

 webClientObj.Credentials = sharePointOnlineCredentialsObject;

When I observed the request pattern using Fiddler, I found that no FedAuth cookie was associated with the request. Then again Google helped me. The solution is to make the WebClient class cookie-aware, that is, to associate the WebClient with a cookie container to which the authentication cookies get added when we pass valid credentials.
The method below shows how to make WebClient cookie-aware. We do a series of steps:
1. Add a cookie container to the derived class – it stores the authentication cookies
2. Override the GetWebRequest method – it associates the authentication cookie with the web request as a header
3. While overriding, add a UserAgent string to the web request

In the absence of the UserAgent string, the cookie will not get added to the container. By adding it, our web request mimics a request sent by a browser. Any valid UserAgent string works – it need not be the one I've used below.

// Method [1]
// This class extends the capabilities of the WebClient class by adding
// a CookieContainer to it.
class AuthenticatedWebClient : System.Net.WebClient
{
    public System.Net.CookieContainer WebClientCookieContainer { get; private set; }

    public AuthenticatedWebClient()
    {
        WebClientCookieContainer = new System.Net.CookieContainer();
    }

    protected override WebRequest GetWebRequest(Uri url)
    {
        var request = (HttpWebRequest)base.GetWebRequest(url);
        // Attach the existing cookie container to the request
        request.CookieContainer = WebClientCookieContainer;
        // Without a UserAgent, the authentication cookie never gets issued
        request.UserAgent = "Mozilla/5.0 (Windows NT 6.0; rv:12.0) Gecko/20100101 Firefox/12.0";
        return request;
    }
}

Now the downloading part. The snippet below shows how to pass credentials along with the cookie-aware WebClient class. We pass the credentials, using a NameValueCollection object, to the login URL of the Office 365 site.

using (AuthenticatedWebClient authenticatedWebClient = new AuthenticatedWebClient())
{
    System.IO.Stream filestream = null;
    var values = new NameValueCollection
    {
        { "username", "saratchandra@myorganization.com" },
        { "password", "mypassword" }
    };
    // Posting the credentials stores the authentication cookies in the container
    authenticatedWebClient.UploadValues("https://login.microsoftonline.com/login.srf", values);
    string url = "https://myorganization.sharepoint.com/sites/development/Documents/TestRule.xlsx";
    byte[] content = authenticatedWebClient.DownloadData(url);
    filestream = new System.IO.MemoryStream(content);

    // The below code is a custom helper method to parse the excel data
    OpenXmlExcelHelper.GetExcelData(filestream);
}

With the method explained above, everything seemed to be under control. However, the result was again an incomplete chunk of bytes – which again was of no use.

This time I really had to use my brain, and this time it worked. Below is the code snippet – a simple way of making a WebClient cookie-aware without much struggle.
Here, the `credentials` object is of type SharePointOnlineCredentials.

// Method [2]
using (WebClient webClient = new System.Net.WebClient())
{
    // GetAuthenticationCookie returns the authentication cookie for the given site
    webClient.Headers.Add("Cookie",
        credentials.GetAuthenticationCookie(new Uri("https://myorganization.sharepoint.com/sites/development")));

    string url = "https://myorganization.sharepoint.com/sites/development/Documents/TestRule.xlsx";
    byte[] content = webClient.DownloadData(url);
    System.IO.Stream filestream = new System.IO.MemoryStream(content);

    // custom method to parse excel data
    OpenXmlExcelHelper.GetExcelData(filestream);
}

Get number of hits to a page – SharePoint(2013) Analytics

This post briefly explains how we can get the total number of hits to a page in SharePoint. In SharePoint 2013, usage analytics is integrated into the Search Service Application (pretty old news – a pre-historic event).
SharePoint analytics is useful for improving the search experience, by surfacing the most popular items or most visited pages.
The code I'm going to explain works fine. However, check the items below before proceeding.

  1. The Search Service Application must be attached to the web application (Central Admin -> Manage Web Applications -> Service Connections)
  2. The SharePoint Timer service must be running

Know the role of the Timer service and the SSA here

public static int GetNumberOfHitsPerPage(Guid siteGuid, SPListItem item)
{
    int totalHits = 0;

    // Total hits per page
    using (SPSite siteCollection = new SPSite(siteGuid))
    {
        using (SPWeb rootWeb = siteCollection.OpenWeb())
        {
            // Method 1: via the Search Service Application proxy
            SPServiceContext serviceContext = SPServiceContext.GetContext(siteCollection);
            ISearchServiceApplication searchServiceApplicationProxy = SearchServiceApplicationProxy.GetProxy(serviceContext);
            SPList pagesList = item.ParentList;
            // What does '1' stand for here? See the EventTypeId note below the code.
            AnalyticsItemData analyticsItemData = searchServiceApplicationProxy.GetAnalyticsItemData(1, pagesList.ID, siteGuid, Convert.ToString(item.ID));
            totalHits = analyticsItemData.TotalHits;

            // Method 2: via the UsageAnalytics class
            UsageAnalytics analyticsData = new UsageAnalytics(siteCollection);
            AnalyticsItemData analyticsItemData2 = analyticsData.GetAnalyticsItemData(1, item);
            totalHits = analyticsItemData2.TotalHits;
        }

        return totalHits;
    }
}

What does '1' stand for in the code above? Answer: it's the EventTypeId.

Here 1, 2, 3 and 4 are EventTypeIds. I believe there should have been an enum type representing these events, instead of passing them around as bare numbers (see the sketch after this list).
1 – Views (number of times an item is viewed)
2 – Recommendation Displayed (number of times an item appeared in recommendations – the Popular Items/Recommended Items web parts)
3 – Recommendation Clicked (number of times the link to an item was clicked when displayed in recommendations)
4 – Search Queries (internal event for reporting purposes)
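
A small local enum keeps the call sites readable. This is purely illustrative – it is not part of the SharePoint API, which expects a plain int, so we cast at the call site.

// Hypothetical enum for readability; not part of the SharePoint API
public enum AnalyticsEventTypeId
{
    Views = 1,
    RecommendationDisplayed = 2,
    RecommendationClicked = 3,
    SearchQueries = 4
}

// Usage:
// analyticsData.GetAnalyticsItemData((int)AnalyticsEventTypeId.Views, item);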

Method 1:

The important step in this method is getting the service context using SPServiceContext.GetContext(..).
We have two overloads of this method.

  1. GetContext(SPSite):
    Takes a site collection object as a parameter. It gets the service application proxy group
    associated with the web application in which the site collection resides, then creates the
    SPServiceContext object from the proxy group object and the id (of type GUID)
    of the site subscription object (spSiteObject.SiteSubscription).
    What is a Site Subscription?

  2. GetContext(HttpContext):
    Takes an HttpContext object as a parameter, then gets the SPServiceContext value in one of the following ways and returns the object:

        // httpContext.Items is of IDictionary type
        (SPServiceContext)httpContext.Items["Microsoft.SharePoint.SPServiceContext"]; // when HttpContext is not null
        // or
        (SPServiceContext)SPThreadContext.Get("Microsoft.SharePoint.SPServiceContext"); // when HttpContext is null


After getting the service context, it's pretty straightforward: next we get the SSA proxy and, from that, the analytics data.

Method 2:

In this method, as a first step, we create an object of the UsageAnalytics sealed class.
The UsageAnalytics class has only one constructor – a parameterized one that takes an SPSite object
as input.
Then we get the AnalyticsItemData object, which contains the historical analytics data
for an item, and from it the number of times the page was visited.