
Tuesday, April 13, 2010

SharePoint 2007 and Event IDs 6482 and 6398

I recently ran into two strange errors when deploying a SharePoint server. As it happens, they appeared just after I deployed the Adobe x64 PDF iFilter pack.

Event ID: 6482 - Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (GUID).

Reason: Exception from HRESULT: 0x80040D1B

And

Event ID: 6398

The Execute method of job definition Microsoft.Office.Server.Search.Administration.IndexingScheduleJobDefinition (GUID) threw an exception. More information is included below.

Exception from HRESULT: 0x80040D1B

As a result, I was unable to navigate to the Search Settings in the default SSP.

It turned out that I had duplicate entries in the registry for the Adobe PDF file type.
Check out the registry key:
HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\GUID\Gather\Portal_Content\Extensions\ExtensionList
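
If you would rather check from the command line, a reg query against that key lists the registered extensions. This is only a quick sketch; the GUID and the value name 38 are placeholders for whatever your own server shows:

    REM List the file extensions registered for the content index.
    reg query "HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\GUID\Gather\Portal_Content\Extensions\ExtensionList"

    REM Each extension appears as a numbered REG_SZ value; if pdf is listed twice,
    REM remove one of the duplicate value names, for example:
    REM reg delete "HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\GUID\Gather\Portal_Content\Extensions\ExtensionList" /v 38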


I removed one of them and we were off again.

This will happen when you add the file type by hand in the registry and then add it as a File Type in the SSP Host web application.

I hope it helps.

Friday, August 22, 2008

SSP Shared Scopes in Microsoft Search Server 2008

The Microsoft Search Server 2008 Search Administration pages offer central, easy administration of the search services in SharePoint.  The only thing missing is the SSP-level Shared Scopes page.  It appears, though, that only the link is missing, as the ASPX page is still present on disk.

To reach the SSP Shared Scopes from the Search Administration pages, browse to /_layouts/ViewScopesSSP.aspx?Mode=ssp on the SSP host.  eg. http://MyMSS2008Server-SSPWebApplication/ssp/admin/_layouts/ViewScopesSSP.aspx?Mode=ssp.

You can add this as a link on the Search Administration page (in the links section) to allow quick access to the SSP Shared Scopes.

As a matter of note, the Infrastructure Update for MOSS 2007 fixes this and adds Scopes to the left-hand action menu of the updated Search Administration pages.

Wednesday, August 20, 2008

SharePoint Backup

Backing up SharePoint can be a daunting task.  There are so many different options, and sometimes more than one is required; for example, MOSS native backup (STSADM -o backup) plus a file system backup of the 12 hive.

So what is the best option?  This is a common question.  Microsoft has released a wonderful white paper on backup, located here: http://technet.microsoft.com/en-us/library/cc262129.aspx (which Ivan Wilson, a colleague of mine, reviewed).

Working in the consulting world of IT, you cannot always implement a new backup solution (such as System Center Data Protection Manager) within a customer's site.  As such, the lowest common denominator, MOSS native backup, is always the safest option.

Utilising STSADM -o backup is an easy way to achieve a good, solid backup (and a repeatable restore).  You can perform full and differential backups, which can be scripted and then scheduled to run via Task Scheduler, as sketched below.
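
As a minimal sketch, a batch file like the one below can be scheduled as two tasks: one passing full weekly and one passing differential nightly. The script name and the \\BackupServer\MOSSBackups share are my own placeholders; both the farm account and the SQL Server service account need write access to the backup share.

    @echo off
    REM moss-backup.cmd full | differential
    REM Runs the MOSS native (catastrophic) farm backup via STSADM.
    REM \\BackupServer\MOSSBackups is a placeholder UNC path.
    set STSADM="%CommonProgramFiles%\Microsoft Shared\web server extensions\12\BIN\stsadm.exe"

    %STSADM% -o backup -directory \\BackupServer\MOSSBackups -backupmethod %1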

The other advantage of the native backup is that it will place the Search database and the full-text index in a consistent state before backing them up.  The command will pause any currently running crawls, perform the backup, and then resume the paused crawls.  This is extremely important, especially if your MOSS solution is an enterprise search solution.  (What is the point of a search solution whose recovery process requires a week of full-crawl indexing?)

One thing to watch out for when using the native backup mechanism (besides some of the pointers in the document) is your database size and available disk space.

During the first part of the native backup, the command queries SQL Server and retrieves the database file sizes for all of the databases being backed up.  It then totals these values and checks the available disk space at the backup location.  If there is not enough space available, it will not continue with the backup.

Why is this a problem?  Well, if you size your databases to a fixed maximum (whether that is recommended is a different story), the query returns the physical database file size and not the amount of space used.  Take, for example, a content database pre-sized to a maximum of 500GB of which only 10MB is currently used.  In this case the backup will first check for 500GB of free space at the target backup location before it continues, regardless of the fact that only 10MB of data actually needs to be backed up.
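
You can see the gap for yourself with sp_spaceused (sqlcmd ships with SQL Server 2005 and later; the server and database names below are placeholders):

    REM Compare allocated file size with actual used space for a content database.
    sqlcmd -S SQLSERVER01 -d WSS_Content -Q "EXEC sp_spaceused"

The database_size column reports the allocated size, which is roughly what the pre-flight check counts, while the unallocated space column shows how much of that allocation is empty.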

Friday, July 18, 2008

The Elusive Crawl Rule

I have been working on a rather large enterprise search project for a little while now. For some background, we are using Microsoft Search Server 2008 to index content where the metadata is contained in databases and the full-text content is held on network shares.

The project is at the point where we are bringing on extra content sources, such as traditional file share content. For this content to be indexed, our default content access account will need to be granted read access to the file shares. At this point, my customer asked me where the password for the content account was stored.

Hmm, interesting question I thought. So we began hunting down where this information is stored.

Step 1: SQL Profiler. Create a crawl rule with a custom account and see what is logged. Lo and behold: nothing.

Step 2: Registry. Success. It turns out that the crawl rules, content sources and basically everything related to the index process is located in the registry.

Registry Root: HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\<SSP GUID>\
Content Sources: ...\Gather\Portal_Content\Content Sources
Crawl Rules: ...\Gather\Portal_Content\Sites\...
Credential Cache: ...\Gathering Manager\Secrets\...
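
For example, you can dump the crawl rule and secrets keys with the standard reg tool (substitute your own SSP GUID; the secret values come back encrypted, so expect ciphertext, not passwords):

    REM Recursively list the crawl rules stored for the SSP.
    reg query "HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\<SSP GUID>\Gather\Portal_Content\Sites" /s

    REM The credential cache; values are encrypted blobs, not clear text.
    reg query "HKLM\Software\Microsoft\Office Server\12.0\Search\Applications\<SSP GUID>\Gathering Manager\Secrets" /s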

So initially I thought: why the registry? In fact, the answer makes quite a lot of sense. As you can only have a single indexer per SSP, it makes sense to place this information in the registry, where it can be locked down. From a security point of view, an attacker must first compromise the server to get access to the registry, and then would still need to decrypt the secrets. Performance is also better, as it is a local registry read.