crawl stuck on "stopping"

Posted Date: August 26, 2010    Points: 0    Category: SharePoint

Hi all

I have a problem with the MOSS 2007 search crawl. It was working fine, then suddenly it stopped showing new content. While troubleshooting I saw that a crawl had been running for more than 2,000 hours. I stopped the crawl, and now it's stuck on "Stopping".

I have googled and seen that a lot of people have had this problem. It can be caused by a maintenance job on the SQL Server (2005) leaving duplicate index values in the search database, or by not having SP2 installed for SQL Server. I checked, and neither applies to us.

Has anybody here run into this problem and fixed it? :)
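A force-stop from the search administration object model is sometimes worth trying before restarting the Office SharePoint Server Search service. Below is a minimal, hedged C# sketch against the MOSS 2007 API, assuming the default SSP; it simply asks every content source that is not idle to stop its crawl again.

    // Hedged sketch (MOSS 2007, default SSP): ask non-idle content sources to stop crawling.
    // Run as a farm administrator on a server in the farm.
    using System;
    using Microsoft.Office.Server;
    using Microsoft.Office.Server.Search.Administration;

    class StopStuckCrawls
    {
        static void Main()
        {
            SearchContext context = SearchContext.GetContext(ServerContext.Default);
            Content content = new Content(context);

            foreach (ContentSource source in content.ContentSources)
            {
                Console.WriteLine("{0}: {1}", source.Name, source.CrawlStatus);

                if (source.CrawlStatus != CrawlStatus.Idle)
                {
                    source.StopCrawl();   // same action as "Stop crawl" in the SSP admin UI
                }
            }
        }
    }

If the status still never leaves "Stopping", restarting the Office SharePoint Server Search service (osearch) on the index server is the usual next step.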



More Related Resource Links

User Profile Service Synchronization Service Stuck in "Stopping" State


I have a customer whose UPS (User Profile Synchronization) service is stuck in a "Stopping" state.

The UPS service application has been deleted. A Stop-SPServiceInstance command has been run, and Get-SPServiceInstance shows the service as "Unprovisioning".

Any ideas on what can be done next to get the service into a stopped state?
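For reference, roughly the same thing can be attempted from the server object model. The sketch below is a hedged illustration only (the TypeName match is an assumption about how the instance is labelled in your farm); it finds the User Profile Synchronization Service instance stuck in Unprovisioning and calls Unprovision() on it again.

    // Hedged sketch: re-drive Unprovision() on a service instance stuck in "Unprovisioning".
    // Verify the exact TypeName in your farm before relying on the Contains() filter.
    using System;
    using Microsoft.SharePoint.Administration;

    class ForceStopUpsSync
    {
        static void Main()
        {
            foreach (SPServer server in SPFarm.Local.Servers)
            {
                foreach (SPServiceInstance instance in server.ServiceInstances)
                {
                    if (instance.TypeName.Contains("User Profile Synchronization") &&
                        instance.Status == SPObjectStatus.Unprovisioning)
                    {
                        Console.WriteLine("Unprovisioning {0} on {1}", instance.TypeName, server.Name);
                        instance.Unprovision();   // same intent as Stop-SPServiceInstance
                    }
                }
            }
        }
    }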

MVC2 issue - stuck trying to create a CMS with nested partials


I am trying to display multiple items on a single page and I am not sure how to tackle it. I have a database table that holds all the page data, and a PagesController that serves it; this works fine. What I would like is that, when the Products page is selected from the menu, I pull back not only the Products page HTML but also all the products, ideally filterable by category. I tried using a partial view, but I can't get it to work. The same applies to other pages: if Gallery is selected, the Gallery page HTML shows and then pushes to a partial view that pulls all the gallery images from the gallery table, and so on.

Maybe I am tackling this the wrong way. The pages are stored in the database to support a CMS system I have set up. Should I instead use models for the other items (Products, Gallery, Videos and so on) and use the partial only for the page HTML?

Any help would be greatly appreciated as I am confused.
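One common MVC 2 approach (a hedged sketch, not necessarily the right fit for this CMS) is to build a composite view model in the controller: the page HTML from the pages table plus the items a partial view will render. The Product, PageViewModel, and in-memory data below are illustrative assumptions standing in for the real tables.

    // Hedged MVC 2 sketch: one view model carries the CMS page HTML and the products
    // that a partial view renders. Types and data are placeholders, not the poster's schema.
    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Mvc;

    public class Product
    {
        public string Name { get; set; }
        public string Category { get; set; }
    }

    public class PageViewModel
    {
        public string PageHtml { get; set; }
        public IList<Product> Products { get; set; }
    }

    public class PagesController : Controller
    {
        // Stand-ins for the real pages/products tables.
        private static readonly Dictionary<string, string> PageBodies = new Dictionary<string, string>
        {
            { "products", "<h1>Products</h1>" },
            { "gallery", "<h1>Gallery</h1>" }
        };
        private static readonly List<Product> AllProducts = new List<Product>
        {
            new Product { Name = "Widget", Category = "Tools" }
        };

        public ActionResult Show(string slug, string category)
        {
            var model = new PageViewModel
            {
                PageHtml = PageBodies.ContainsKey(slug) ? PageBodies[slug] : string.Empty,
                Products = slug == "products"
                    ? AllProducts.Where(p => category == null || p.Category == category).ToList()
                    : new List<Product>()
            };
            return View(model);
        }
    }

In the Show view you would then render Model.PageHtml and call Html.RenderPartial("ProductList", Model.Products), so each page type gets its own partial while the page body always comes from the CMS table.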

Spider in .NET: Crawl Web Sites and Catalog Info to Any Data Store with ADO.NET and Visual Basic .NET


Visual Basic .NET comes loaded with features not available in previous versions, including a new threading model, custom class creation, and data streaming. Learn how to take advantage of these features with an application that is designed to extract information from Web pages for indexing purposes. This article also discusses basic database access, file I/O, extending classes for objects, and the use of opacity and transparency in forms.

Mark Gerlach

MSDN Magazine October 2002
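As a rough illustration of what such a spider does (a hedged C# sketch, not the article's VB .NET code), the core loop is: download a page, extract its links, and queue them for the next pass while cataloguing whatever you want to index.

    // Hedged sketch of a spider's core loop: fetch, harvest links, queue.
    // Error handling, politeness delays, and robots.txt handling are omitted.
    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Text.RegularExpressions;

    class MiniSpider
    {
        static void Main()
        {
            var queue = new Queue<string>();
            var seen = new HashSet<string>();
            queue.Enqueue("http://example.com/");          // placeholder start address

            using (var client = new WebClient())
            {
                while (queue.Count > 0 && seen.Count < 50)  // small cap for the sketch
                {
                    string url = queue.Dequeue();
                    if (!seen.Add(url)) continue;

                    string html = client.DownloadString(url);
                    Console.WriteLine("Catalogued {0} ({1} chars)", url, html.Length);

                    // Naive link extraction; a real spider resolves relative URLs and filters hosts.
                    foreach (Match m in Regex.Matches(html, "href\\s*=\\s*\"(http[^\"]+)\"", RegexOptions.IgnoreCase))
                    {
                        queue.Enqueue(m.Groups[1].Value);
                    }
                }
            }
        }
    }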

Can't start crawl because of index move operation


Hi there,

I can't start the crawl task. The log says "Deleted by the gatherer (The start address or content source that contained this item was deleted and hence this item was deleted.)", but I did not change the paths of any content sources. When I try to start the crawl job it says "Crawling might be paused because a backup or an index move operation is in progress. Are you sure you want to resume this crawl?"

What is an index move operation? What should I do? I would greatly appreciate a solution. Thanks in advance.
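That warning usually means the search service's background activity (crawling) was paused farm-wide, typically by a backup or an index move that never finished cleanly. A hedged sketch of resuming it through the MOSS 2007 search administration object model (default SSP assumed) is below; make sure no backup or index move is genuinely still running before resuming.

    // Hedged sketch: resume paused crawl/background activity on the default SSP.
    using Microsoft.Office.Server;
    using Microsoft.Office.Server.Search.Administration;

    class ResumeCrawling
    {
        static void Main()
        {
            SearchContext context = SearchContext.GetContext(ServerContext.Default);
            Content content = new Content(context);

            // Pairs with PauseBackgroundActivity(), which backups and index moves use to halt crawling.
            content.ResumeBackgroundActivity();
        }
    }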



"Content for this URL is excluded by the server because a no-index attribute." in crawl logs


Hi All,

I am getting following error message in Crawl Logs

" Content for this URL is excluded by the server because a no-index attribute. "

Any help in this regard will be greatly appreciated.
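This message is logged when the crawler sees a robots "noindex" signal on the page, either a meta robots tag in the HTML or an X-Robots-Tag response header; in SharePoint it can also come from a site or list that is excluded from search results (for example SPList.NoCrawl or the site's search visibility setting). A hedged way to confirm what the server is actually sending (the URL is a placeholder) is a quick check like this:

    // Hedged sketch: check a URL for the "noindex" signals a crawler honours.
    // The URL is a placeholder; run as an account that can read the page.
    using System;
    using System.IO;
    using System.Net;

    class NoIndexCheck
    {
        static void Main()
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://server/sites/page.aspx");
            request.UseDefaultCredentials = true;

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine("X-Robots-Tag header: {0}", response.Headers["X-Robots-Tag"] ?? "(none)");

                string html = reader.ReadToEnd();
                bool mentionsNoIndex = html.IndexOf("noindex", StringComparison.OrdinalIgnoreCase) >= 0;
                Console.WriteLine("Page body mentions \"noindex\": {0}", mentionsNoIndex);   // crude check of the meta robots tag
            }
        }
    }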



InfoPath form stuck on installing or deleting



I am deploying an InfoPath form in Central Administration and it is stuck with the status "Installing"; if I try to remove it, it gets stuck with the status "Deleting". I restarted the timer service and ran the admin jobs on all servers (front-end and application), but it still shows the same status. I also renamed the form and tried again, with no result. Can anyone suggest how to resolve this? I am wondering how this started happening just in the last week; there have been no changes to the servers or the environment, and it worked fine before.
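When a form template sits on "Installing" or "Deleting", there is usually an administrative timer job that never completed. Besides stsadm -o execadmsvcjobs, a hedged way to see what is pending is to list the farm's timer job definitions and look for solution or form deployment jobs; the name filter below is an assumption, so inspect the full list if nothing matches.

    // Hedged sketch: list timer jobs whose names suggest solution/form deployment.
    using System;
    using Microsoft.SharePoint.Administration;

    class ListDeploymentJobs
    {
        static void Main()
        {
            foreach (SPJobDefinition job in SPFarm.Local.TimerService.JobDefinitions)
            {
                if (job.Name.IndexOf("solution", StringComparison.OrdinalIgnoreCase) >= 0 ||
                    job.Name.IndexOf("form", StringComparison.OrdinalIgnoreCase) >= 0)
                {
                    Console.WriteLine("{0}  (last run: {1})", job.DisplayName, job.LastRunTime);
                }
            }
        }
    }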



Trying to get to a page via an email link. Could the sitemap be stopping this?


I had this working. I would email a link to a page (e.g. Work.aspx), and using the code below, the user would be routed to the login page and then sent to Work.aspx after logging in.

But since I added a site map this isn't working.  I always go directly to my home page.  If I set Work.aspx as my start page in VS it works, but not if I click a link in the email.

Any thoughts?

    protected void Page_Load(object sender, EventArgs e)
    {
        // If the user is not authenticated, send them to the login page and
        // carry the page they were trying to reach in ReturnUrl so they come
        // back to it after logging in.
        if (!Page.User.Identity.IsAuthenticated)
        {
            Response.Redirect("~/Login.aspx?ReturnUrl=" + Server.UrlEncode("~/Work.aspx"));
        }
    }

Status "Installing" keeps stuck

Hi All..

I have a problem with a status that is stuck. I've read many forum threads related to my problem, but still haven't been able to solve it :(

Here is my problem...

For three months I have been uploading, removing, and re-uploading forms with no problem. But today, when I upload my form, the status stays on "Installing" instead of "Ready".
Then I removed my form using Timer Job Definitions and deleted form.wsp. That works.
Afterwards, I tried to upload the form again and got the same stuck status, then deleted it again in the same way.
I've tried more than 10 times, and the result is still the same.

Things that I've tried are:
- Reset IIS.
- Reset Application Pool.
- Check the services (e.g. Windows SharePoint Services Administration, Search, Timer, Tracing); all are already started.
- Use "stsadm -o execadmsvcjobs"
- Reset the InfoPath form ID
- Publish and change the InfoPath form name
- Command "net start SPTimerV3" and "net start SPAdmin"

I have tried all the things above, but it still doesn't work. The status is still stuck.

I have also tried uploading other forms, but I still have the same problem.

Please help me. I'm stuck already... :((

Is there anything else I can try?
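Since the form is deployed as a .wsp, one more hedged thing to check is the solution's own state in the farm solution store; if it reports a pending job, that is what execadmsvcjobs needs to flush. The package name form.wsp is taken from the post above; everything else is an assumption.

    // Hedged sketch: inspect the deployment state of the form's solution package.
    using System;
    using Microsoft.SharePoint.Administration;

    class CheckFormSolution
    {
        static void Main()
        {
            SPSolution solution = SPFarm.Local.Solutions["form.wsp"];
            if (solution == null)
            {
                Console.WriteLine("form.wsp is not in the solution store.");
                return;
            }

            Console.WriteLine("Deployed: {0}", solution.Deployed);
            Console.WriteLine("Deployment state: {0}", solution.DeploymentState);
            Console.WriteLine("Pending deployment job: {0}", solution.JobExists);
        }
    }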


Only crawl one site collection

Hi. We have an intranet with about 100 site collections. How can I set up one of those to be in a separate content source that can be crawled more often? Do I need to create two content sources: one containing the other 99 site collections with the setting "Crawl only the SharePoint site of each start address", and the other containing my prioritized site collection with the same setting? I would also like to ask whether crawl rules have any effect on the order in which content is crawled. If I include a certain site with order 1, will that site always be crawled first? //Niclas
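Yes, the usual approach is a dedicated content source for the prioritized site collection, with "Crawl only the SharePoint site of each start address" and its own, more frequent schedule; crawl rules control inclusion and exclusion, not crawl order. A hedged sketch of creating such a content source through the MOSS search administration object model follows; the name and start address are placeholders.

    // Hedged sketch: create a separate content source for one prioritized site collection.
    using System;
    using Microsoft.Office.Server;
    using Microsoft.Office.Server.Search.Administration;

    class CreatePriorityContentSource
    {
        static void Main()
        {
            SearchContext context = SearchContext.GetContext(ServerContext.Default);
            Content content = new Content(context);

            SharePointContentSource source = (SharePointContentSource)
                content.ContentSources.Create(typeof(SharePointContentSource), "Priority site collection");

            source.StartAddresses.Add(new Uri("http://intranet/sites/priority"));

            // Equivalent of "Crawl only the SharePoint Site of each start address".
            source.SharePointCrawlBehavior = SharePointCrawlBehavior.CrawlSites;
            source.Update();
        }
    }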

Issues regarding sockets (stopping receiving packets).

Hello, I've created a server for my game, but for some unknown reason the server (most likely) stops receiving packets while the client still tries to send them. I've tried a couple of examples from MSDN, but it still just stops receiving. This is what I think happens:

- The client connects to the server.
- The server sends packets as a response.
- The client responds to those packets.
- The server sends a lot of packets (one packet at a time).
- The client tries to send a packet, and it seems to fail (although it does send packets).
- The server waits and no longer receives any packets.
- The client triggers the timeout code on the server side and gets disconnected from the server.

Does anyone have an idea how and why this can happen? Here is the receive function I use (SocketInfo holds data such as the amount of data to receive, the state of the packet (header or content), and the buffer):

    public void BeginReceive(SocketInfo socketInfo)
    {
        if (mDisconnected != 0)
            return;

        try
        {
            args = new SocketAsyncEventArgs();
            args.Completed += (s, a) => EndReceive(a);
            args.UserToken = socketInfo;
            args.SetBuffer(socketInfo.DataBuffer, socketInfo.Index,
                           socketInfo.DataBuffer.Length - socketInfo.Index);

            if (args != null)
            {
                try
                {
                    // ReceiveAsync returns false when it completed synchronously,
                    // in which case the Completed event will not fire.
                    if (!Socket.ReceiveAsync(args))
                        EndReceive(args);
                }
                catch (ObjectDisposedException)
                {
                    Disconnect();
                    return;
                }
            }
        }
        catch (ObjectDisposedException)
        {
            Disconnect();
        }
    }

C# newbie stuck - trying to access column data in a SharePoint list in an SSIS script task

Hello, I'm sure this is the simplest question, but I can't figure it out, even with Google's help. I am trying to stumble through some C# code in an SSIS script task, and I am frustrated that I can't figure out how to do the easiest things. Eventually I want to find data in a column, and then use another list as a lookup to replace that value with another wherever the existing value matches a value in the lookup list. So, the data in my (multiple choice) column might be "apples; bananas", and in another list I have a row with two columns, the first holding the value "Apples" and the second containing "Red Delicious"; my original column should then read "Red Delicious; bananas". But, alas, I can't even figure out how to see the data that is in a column. Here is my code:

    /*
       Microsoft SQL Server Integration Services Script Task
       Write scripts using Microsoft Visual C# 2008.
       The ScriptMain is the entry point class of the script.
    */

    using System;
    using System.Data;
    using Microsoft.SharePoint;
    using Microsoft.SqlServer.Dts.Runtime;
    using System.Windows.Forms;
    using Microsoft.SharePoint.Utilities;

    namespace ST_08becda4c05c49cd9f30ea76110076cd.csproj
    {
        [
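For the lookup-and-replace part itself, a hedged sketch against the SharePoint server object model is below; the site URL, list names, and field names are placeholders, and inside an SSIS script task you would typically pass these in as package variables. Note that a real multi-choice field may store its values as a ";#"-delimited string (SPFieldMultiChoiceValue) rather than the "value; value" form quoted above.

    // Hedged sketch: rewrite a multi-choice column using a second list as a lookup table.
    // URLs, list names, and field names are placeholders, not the poster's real schema.
    using System;
    using System.Collections.Generic;
    using Microsoft.SharePoint;

    class FruitLookupReplace
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/demo"))
            using (SPWeb web = site.OpenWeb())
            {
                // Build the lookup: "Apples" -> "Red Delicious", etc.
                var lookup = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
                foreach (SPListItem row in web.Lists["FruitLookup"].Items)
                {
                    lookup[Convert.ToString(row["Title"])] = Convert.ToString(row["ReplacementValue"]);
                }

                // Walk the target list and rewrite the multi-choice column.
                foreach (SPListItem item in web.Lists["Inventory"].Items)
                {
                    string raw = Convert.ToString(item["FruitChoices"]);   // e.g. "apples; bananas"
                    if (string.IsNullOrEmpty(raw)) continue;

                    string[] parts = raw.Split(new[] { "; " }, StringSplitOptions.RemoveEmptyEntries);
                    for (int i = 0; i < parts.Length; i++)
                    {
                        string replacement;
                        if (lookup.TryGetValue(parts[i], out replacement))
                            parts[i] = replacement;
                    }

                    item["FruitChoices"] = string.Join("; ", parts);
                    item.Update();
                }
            }
        }
    }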

Any advice on stopping the following hacker (Meher Assel)

Hi all, I have just logged into my hosting provider and all 3 of my sites have been hacked by the same person (Meher Assel). Has anyone come across this person before, and is there any advice on how to stop it? All that appears to have been done is that files were created in my directory on my hosting provider saying I have been hacked. Many thanks, Kered

SharePoint crawl errors on files which are not present

All, I'm noticing 2 errors in my crawl logs. Neither of the files exists anywhere on our site. The URLs are http://.../forms/repair.aspx and http://.../forms/combine.aspx, and the error message is 'Error in the Microsoft Windows SharePoint Services Protocol Handler'. Our crawl normally takes about 3 and a half hours; recently it's been taking 5-6 hours. These 2 errors are logged at the end of the crawl. While the crawl is running, I see the success count growing, and at about 3 and a half hours into the process the success count stops growing. I'm not sure what the crawl is doing for the next 2 or so hours, but it finally logs the 2 errors mentioned earlier and then completes. I have tried resetting the crawled content and changing the index location of the SSP, but neither has worked. I have also tried excluding the path to these two files with crawl rules, but that hasn't worked. I am on SharePoint 2007 SP2. Any ideas? Thanks

WorkFlow will not start. Stuck on workflowRuntime.CreateWorkflow?

In Visual Studio 2008/C#, using the wizard, I created a 'Sequential Workflow Console Application'. I have made no changes to the project generated by Visual Studio 2008. The code gets stuck on:

    WorkflowInstance instance = workflowRuntime.CreateWorkflow(typeof(WorkflowConsoleApplication1.Workflow1));

The application never times out or fails. Any ideas? twahl
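For comparison, this is roughly what the unmodified VS 2008 template host looks like (a hedged reconstruction, not the poster's exact project). CreateWorkflow itself only builds the instance, so if execution appears stuck there it is worth checking that no persistence or tracking service added in App.config is hanging while connecting to its database, and that instance.Start() and the wait handle are still in place.

    // Hedged reconstruction of the standard WF 3.5 console host pattern.
    // WorkflowConsoleApplication1.Workflow1 is the workflow type the template generates.
    using System;
    using System.Threading;
    using System.Workflow.Runtime;

    class Program
    {
        static void Main()
        {
            using (WorkflowRuntime workflowRuntime = new WorkflowRuntime())
            {
                AutoResetEvent waitHandle = new AutoResetEvent(false);
                workflowRuntime.WorkflowCompleted += (s, e) => waitHandle.Set();
                workflowRuntime.WorkflowTerminated += (s, e) =>
                {
                    Console.WriteLine(e.Exception.Message);
                    waitHandle.Set();
                };

                WorkflowInstance instance =
                    workflowRuntime.CreateWorkflow(typeof(WorkflowConsoleApplication1.Workflow1));
                instance.Start();     // CreateWorkflow only creates the instance; Start() runs it

                waitHandle.WaitOne(); // block until the workflow completes or terminates
            }
        }
    }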

Cannot crawl SharePoint site and My Site after database attach upgrade from SharePoint 2007 to 2010

After a database attach upgrade of the site and My Site from SharePoint 2007 to 2010, I ran a full crawl. For My Site I get "The crawler could not communicate with the server. Check that the server is available and that the firewall access is configured correctly. If the repository was temporarily unavailable, an incremental crawl will fix this error. ( Error from SharePoint site: HttpStatusCode ServiceUnavailable The request failed with HTTP status 503: Service Unavailable. )", and for the SharePoint site I get "Access is denied. Verify that either the Default Content Access Account has access to this repository, or add a crawl rule to crawl this repository. If the repository being crawled is a SharePoint repository, verify that the account you are using has "Full Read" permissions on the SharePoint Web Application being crawled." The content access account for search is db_owner on both the site and My Site databases. How do I solve this problem?
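The "Access is denied" message points at the web application user policy rather than database roles: the default content access account needs "Full Read" on the web application being crawled (db_owner on the content database is not what the crawler checks). A hedged sketch of granting that from the object model follows; the URL and account are placeholders. The 503 on My Site usually means the My Site web application's application pool is stopped, or on a single-server farm, the loopback check is blocking the crawler, which is a separate fix.

    // Hedged sketch: grant the crawl account "Full Read" on a web application.
    // The URL and account name are placeholders for the real web app and content access account.
    using System;
    using Microsoft.SharePoint.Administration;

    class GrantFullReadToCrawler
    {
        static void Main()
        {
            SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://intranet"));

            SPPolicy policy = webApp.Policies.Add(@"DOMAIN\svc-crawl", "Search crawl account");
            policy.PolicyRoleBindings.Add(
                webApp.PolicyRoles.GetSpecialRole(SPPolicyRoleType.FullRead));

            webApp.Update();
        }
    }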

FAST Search crawl queue

I am attempting to do a full crawl (FAST Search) of a handful of Word, Excel, and PowerPoint documents. It has been running for ~3 hours. If I look at the Crawl Queue report in the Administrative Report Library, it shows 150 transactions queued. In the Crawl Processing per Activity report, the last entry on the graph is 300 seconds for Initializing Document Filtering for what looks like the entire 2 hours. The SQL Server, SharePoint Server, and FAST Search servers all appear to have low utilisation (CPU, memory, disk). There are only 2 warnings in the FAST Search crawl log (don't crawl the search site and the search cache directory): Success = 0, Error = 0, Warning = 2. Before I set up FAST Search, SharePoint Search took approximately 6 minutes to crawl a similar list of documents. How do I troubleshoot this issue?