Channel: SharePoint 2013 - Setup, Upgrade, Administration and Operations forum

Want to take a content type already in use in many site collections and manage it from a content type hub so updates propagate to those collections?

I have a custom content type, call it "Custom Content Type", which is already in use in 6 different site collections.  I now have a content type hub.  I want to be able to update "Custom Content Type" in the content type hub and have it propagate to the site collections, instead of having to edit the type in the 6 different site collections.  Can I do this, or does each of these content types have a different ID even though they share the same name?
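Whether hub publishing can overwrite the existing copies depends on whether they really are the same content type (same ID) or merely share a display name. A minimal PowerShell sketch for checking that, and for pointing the Managed Metadata Service at a hub; the service application name and all URLs are placeholders for your own farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Point the Managed Metadata Service application at the hub site collection.
Set-SPMetadataServiceApplication -Identity "Managed Metadata Service" -HubUri "http://portal/sites/cthub"

# Compare the ID of "Custom Content Type" across the existing site collections.
# Identical IDs suggest the same content type everywhere; different IDs mean they are
# separate content types that only share a name, and hub publishing will not update them.
$urls = "http://portal/sites/site1", "http://portal/sites/site2"
foreach ($url in $urls)
{
    $web = Get-SPWeb $url
    $ct  = $web.AvailableContentTypes["Custom Content Type"]
    Write-Host "$url : $($ct.Id)"
    $web.Dispose()
}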

Mobile Contemporary View not working in SharePoint 2013


I've just stood up a pilot site for some users to play around with in SharePoint 2013. However, even though the mobile view feature is enabled, when I hit the site it fails to load: I receive the error "Exception of type 'System.ArgumentException' was thrown. Parameter name: encodedValue" and it won't load.  I was under the impression that, out of the box, the contemporary view should load as long as the feature is enabled.  I haven't gotten into device channels or anything yet, but I would like the users to see the new mobile experience.  Any help would be greatly appreciated.

I should have also mentioned it is a team site with publishing features enabled.

Thank you.
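As a first isolation step it can help to toggle the Mobile Browser View feature on the pilot web and retest, which at least tells you whether the contemporary view itself or the publishing feature interaction is at fault. A minimal sketch; the feature is looked up by name rather than hard-coded because its internal name can vary, and the web URL is a placeholder:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the web-scoped mobile browser view feature definition on this farm.
$mobileFeature = Get-SPFeature |
    Where-Object { $_.DisplayName -like "*Mobile*" -and $_.Scope -eq "Web" } |
    Select-Object -First 1

# Toggle it on the pilot web (URL is a placeholder), then retest from the mobile browser.
$webUrl = "http://portal/sites/pilot"
Disable-SPFeature -Identity $mobileFeature -Url $webUrl -Confirm:$false
Enable-SPFeature  -Identity $mobileFeature -Url $webUrl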


Suppress .js files in SharePoint 2013 compatibility mode


I have .js files such as rcui.js and SP.Ribbon.js suppressed for anonymous users in my 2010 farm.

Now I have upgraded the site to 2013, and it is running in 2010 compatibility mode.

When I browse the site I see the .js files being hit in Fiddler, and the same when I debug with Firebug in Firefox.

The anonymous site is still a little faster than the authenticated site.

But I am wondering whether things have changed in 2013: do we need to rewrite the code we wrote for 2010 to suppress the .js files?

I have not been able to find anything in my searches, so I am wondering whether anyone has come across this.


kukdai

Crawl errors: Sandbox worker pool is closed


Hi

We are currently setting up our SharePoint farm and have run into an issue I could not find any information about.

After setting up multiple content sources, we get the following errors in the crawl log for some of the items:

Processing this item failed because of an unknown error when trying to parse its contents. ( Error parsing document 'http://site/url'. Sandbox worker pool is closed.; ; SearchID = 6A9F9644-31EE-4796-B355-D92C35D50973 )

and

The content processing pipeline failed to process the item. ( Error parsing document 'http://site/url'. It was not possible to acquire a worker. Proxy '131' failed to acquire worker. Sandbox worker pool is closed.; ; SearchID = 194A2340-CB92-4F36-AAB8-C55EC794F7C3 )

A few more characteristics:

  • This does not occur for all items in a content source, only for a few of them.
  • Once it starts, the crawl takes a very long time to complete.
  • After rebooting the back-end and search query servers the error is usually gone (we can crawl items successfully again), but after a few hours it is back.

Any insight would be much appreciated.

Thanks

Avishay
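The "Sandbox worker pool is closed" message comes from the content processing component, so before rebooting it is worth checking component health and restarting only the search host controller on the processing servers. A rough diagnostic sketch (not a root-cause fix), run on a server hosting the content processing component:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Check the health of all search components when the errors start appearing.
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchStatus -SearchApplication $ssa

# Restarting the SharePoint Search Host Controller service is a lighter-weight
# alternative to a full reboot while the root cause is being investigated.
Restart-Service SPSearchHostController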


Transitioning to SharePoint 2013 Foundation Companywide: Overwhelmed...Where to Start?


We have various shares within the company. They are accessed every day, and we've gone from 20 employees to 160. We badly need content management. I have SharePoint 2013 in place, but I'm overwhelmed. Where do I start?

1. P: Drive....On MediaServer.....We have 2TB of information on one server's drive which is shared out. It's our Photos share.

2. M: Drive, L: Drive, Y: Drive, S: Drive....On Server#1.....1.6 TB

The plan is to eliminate all those shares and have departments access their content within their own SharePoint site. We don't want anyone using shares anymore, and we will disconnect the shares so no one can access them again.

Do I import all that content into SharePoint? If so, what kind of redundancy should I have in place? Does that mean that ALL the company data would reside in the SQL Server instance that SharePoint uses?

If the only SharePoint server we have goes down, how can users access their information? Do they just wait until we fix the server? Should we have two SharePoint servers and two SQL servers running?

I have no idea how to proceed. Any help would be appreciated.
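On the storage question: yes, documents uploaded to SharePoint live in the content databases on SQL Server, which is why SQL sizing, backups and high availability need to be planned before any bulk import. For the import itself, a rough sketch of copying one folder of a share into a document library with the server object model; the site URL, library name and share path are placeholders, and a dedicated migration tool handles metadata, permissions and throttling far better than this:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web    = Get-SPWeb "http://portal/sites/photos"    # placeholder site
$folder = $web.GetFolder("Shared Documents")         # placeholder library

# Copy the files from one directory of the Photos share into the library.
Get-ChildItem "\\MediaServer\Photos\2014" -File | ForEach-Object {
    $bytes = [System.IO.File]::ReadAllBytes($_.FullName)
    $folder.Files.Add($_.Name, $bytes, $true) | Out-Null   # $true = overwrite existing
}
$web.Dispose()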

SharePoint 2013: disable Save/Share or Print options in Office Web Apps for PDF and Word docs


Hi,

How can we disable the Save, Share, or Print options in Office Web Apps for Word and PDF files?

Regards,
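Office Web Apps 2013 does not expose a per-button switch for these commands; the usual partial approach is to keep users on view-only permissions and to trim the WOPI bindings so only view actions remain for the file types in question. A sketch on the SharePoint side; the action and extension names below are examples, so list the bindings first to see what your farm actually has:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# See which OWA actions are bound for Word and PDF files.
Get-SPWOPIBinding -Extension "DOCX"
Get-SPWOPIBinding -Extension "PDF"

# Example: remove the edit binding for DOCX so Word files open read-only in the browser.
Remove-SPWOPIBinding -Extension "DOCX" -Action "edit" -Confirm:$false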

Troubleshooting a PowerShell backup routine.


Hi all,

I am pretty new to PowerShell, but I have come up with a script to perform nightly backups of my SharePoint farm, which I in turn copy to a file server.  About 75% of the time the backup works and I get a ZIP file; the other 25% of the time it doesn't, and nothing appears.  Interestingly, based on the "last modified" dates of the local and remote backup folders, it seems like some activity did occur (farm backup created, zipped, and possibly even copied), but no end-product zip file shows up.  I am guessing it's something in my script.  Could somebody take a look and give me some insight into what might be wrong?  Thanks!

#Variable declarations.
$date = Get-Date
$localBackupDir = "C:\Backups\Farm\" + $date.Year + "." + $date.Month + "." + $date.Day
$archiveFileName = $localBackupDir + "\" + $date.Year + "." + $date.Month + "." + $date.Day + ".zip"
$remoteBackupDir = "\\<file server>\sp_backup\<sp_server>"

#PS SP snapin.
Add-PSSnapin "Microsoft.SharePoint.Powershell"

#PSCX module, which enables zipping via PS. This needs to be installed on any machine in which this script is run.
Import-Module pscx

#Raw farm backup.
mkdir $localBackupDir
Backup-SPFarm -Directory $localBackupDir -BackupMethod Full -Verbose

#Zipping.  Should be noted that the farm backup compresses the database files somewhat already.
cd $localBackupDir
Write-Zip $localBackupDir -OutputPath $archiveFileName -IncludeEmptyDirectories -Verbose

#Move zip files to remote drive(s).
copy *.zip $remoteBackupDir
cd ..

#Delete local copies.
rmdir -Force -Recurse $localBackupDir

#Delete remote backups older than 7 days.
cd $remoteBackupDir
foreach ($i in Get-ChildItem $remoteBackupDir -Recurse)
{
    if ($i.CreationTime -lt ($date.AddDays(-7))) {
        Remove-Item $i.FullName -Force
    }
}

#Synopsis!
$endDate = Get-Date
Write-Host "Backup started at $date and ended at $endDate."
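Two things that could explain intermittent missing archives: the .zip is written inside the same folder that Write-Zip is archiving, and the copy relies on the current directory set by cd. A sketch of the middle of the script with those points tightened and basic error checks added, reusing the variables declared at the top of the script above:

# Write the archive next to, not inside, the folder being zipped.
$archiveFileName = "C:\Backups\Farm\" + $date.Year + "." + $date.Month + "." + $date.Day + ".zip"

try {
    Backup-SPFarm -Directory $localBackupDir -BackupMethod Full -Verbose -ErrorAction Stop
    Write-Zip $localBackupDir -OutputPath $archiveFileName -IncludeEmptyDirectories -ErrorAction Stop

    # Only copy and clean up if the archive actually exists.
    if (Test-Path $archiveFileName) {
        Copy-Item $archiveFileName -Destination $remoteBackupDir -ErrorAction Stop
        Remove-Item $localBackupDir -Recurse -Force
        Remove-Item $archiveFileName -Force
    }
}
catch {
    Write-Host "Backup failed: $_"
}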

DistributedCache entries in SharePoint Log


Hi,

I notice that the ULS log on my WFE is filling up with lots of entries for DistributedCache. Do you have any idea how to turn that off?

Calling... SPDistributedCacheClusterCustomProvider:: GetValue(object transactionContext, string type, string key).
Successfully executed... SPDistributedCacheClusterCustomProvider:: GetValue(object transactionContext, string type, string key).

Thank you!

Regards,

Edwin
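Those "Calling…/Successfully executed…" lines are verbose trace entries, so they can usually be suppressed by raising the trace severity of the Distributed Cache logging category. A sketch; the category string below is an assumption, so confirm the exact name with Get-SPLogLevel on your farm first:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the exact area/category name for distributed cache logging.
Get-SPLogLevel | Where-Object { $_.Area -like "*Distributed*" -or $_.Name -like "*Distributed*" }

# Raise the threshold so verbose chatter is no longer written to the ULS log.
Set-SPLogLevel -Identity "SharePoint Server:Distributed Cache" -TraceSeverity Unexpected

# Clear-SPLogLevel -Identity "SharePoint Server:Distributed Cache" later restores the default.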


How can I configure distributed cache servers and front-end servers for a streamlined topology in SharePoint 2013?


My question is about SharePoint 2013 farm topology. If I go with a streamlined topology of 2 distributed cache and request management servers, 2 front-end servers, 2 batch-processing servers, and a clustered SQL Server, how will the distributed cache servers connect to the front-end servers? Can I use the Windows Server 2012 NLB feature? If I use NLB, do I need to install NLB on all the distributed cache and front-end servers and split out the services? What would the configuration look like for my scenario?

Thanks in advance!
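The distributed cache servers generally don't need NLB themselves: the front ends discover the cache cluster through the farm configuration, so NLB (or a hardware load balancer) is only relevant for spreading end-user HTTP traffic across the two front-end servers. What you control from PowerShell is which servers host the cache. A sketch, run on the servers indicated in the comments:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# On each of the two dedicated cache/request management servers: join the cache cluster.
Add-SPDistributedCacheServiceInstance

# On the front-end and batch-processing servers: make sure they are NOT cache hosts.
Remove-SPDistributedCacheServiceInstance

# Verify which servers ended up running the Distributed Cache service instance.
Get-SPServiceInstance | Where-Object { $_.TypeName -eq "Distributed Cache" } |
    Select-Object Server, Status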


Register-SPWorkFlowService error: The remote server returned an error: (404) Not Found


Hi everyone!

I am trying to register the workflow service with the "Register-SPWorkflowService" cmdlet.

I have only one server, named "spserver", with Windows Server 2012 R2 + SharePoint Server 2013 + Project Server 2013 installed.

I installed Workflow Manager and the Workflow Manager Client, and configured it without SSL.

I have access to the http://spserver:12291 page when I run my browser in "Run as administrator" mode.

Now I am trying to register the service from the SharePoint Management Shell. Command:

Register-SPWorkFlowService -SPSite "http://spserver/pwa" -WorkflowHostUri "http://spserver.sp.tke:12291" -AllowOauthHttp -Force

But I get the 404 error from the title. Of course, the PWA site is accessible and works perfectly.

Help please, I am a newbie :)

Thanks.
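A 404 from Register-SPWorkflowService usually means the -WorkflowHostUri doesn't exactly match the endpoint Workflow Manager is publishing (host name, port, or HTTP vs HTTPS). A sketch, assuming the same host and PWA URL as above; confirm the endpoints first from the Workflow Manager PowerShell console before re-registering:

# In the Workflow Manager PowerShell console on spserver: inspect the farm's endpoints
# and ports (HTTP management traffic is normally on 12291, HTTPS on 12290).
Get-WFFarm

# Re-register from the SharePoint Management Shell, using exactly the host name reported above.
Register-SPWorkflowService -SPSite "http://spserver/pwa" `
                           -WorkflowHostUri "http://spserver.sp.tke:12291" `
                           -AllowOAuthHttp -Force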

SharePoint Online - New-WebServiceProxy not authenticating


Hi all,

I'm working on a script to update data in User Profiles.  The normal sync chain for Office 365 doesn't permit customization the way you can on-premises, so the Support folks provided me with a script to update user data directly from the on-premises AD.

The script works, but it uses GetAuthenticatedCookies to get the authentication cookies and stuff them into the New-WebServiceProxy object via a CookieContainer.  That works, of course, but it requires a login through a pop-up, which is not useful for a timed background job.

My goal is to replace the popup with passed PSCredentials.  The approach I landed on was to simply use the -Credential option of New-WebServiceProxy.  I swear it was working last night; I had a few test runs and all was good.  I went back this morning to tidy up comments, did one last check, and GetUserProfileByIndex threw an exception saying I had attempted to perform an unauthorized operation.  I'm baffled!  I am very (very!) new to PowerShell, so perhaps this is just a newbie thing.  Maybe credentials were somehow being stored in my environment through something else I had tried and disappeared when I closed the ISE.

I've also noted that passing bad credentials, or allowing the popup and typing in gibberish, does not result in an exception at the New-WebServiceProxy declaration.

So the question is: "Should I be able to use PSCredentials in an Office 365 environment?"  I'm wondering if there might be restrictions in the claims-based authentication world.

The script fragment:

# Local parameters
$userName = "admin user at tenant.onmicrosoft.com"
$password = "somepassword"
$siteAdminUrl ="tenant-admin.sharepoint.com"
#load required assemblies
$script_folder = (Split-Path -Parent $MyInvocation.MyCommand.Path)
[void][System.Reflection.Assembly]::LoadFile($script_folder + "\Microsoft.SharePoint.Client.dll")
[void][System.Reflection.Assembly]::LoadFile($script_folder + "\Microsoft.SharePoint.Client.Runtime.dll")
[void][System.Reflection.Assembly]::LoadFile($script_folder + "\ClaimsAuth.dll")
# Path to user profile service web service - ups
$ups_url = $siteAdminUrl.TrimEnd('/') + "/_vti_bin/UserProfileService.asmx";
# Put username and password into a standard credential object
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$O365Credential =  New-Object System.Management.Automation.PSCredential($userName, $securePassword)
# And set up the service.  -Credential expects a PSCredential object, so pass the
# credential built above rather than the bare user name string.
$ups_ws = New-WebServiceProxy -Uri $ups_url -Credential $O365Credential
$ups_indexresult = $ups_ws.GetUserProfileByIndex(-1)
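SharePoint Online uses claims-based authentication, so the plain -Credential parameter of New-WebServiceProxy (which sends Windows/basic credentials) generally will not authenticate against the .asmx endpoints without the cookie step. The usual non-interactive route is SharePointOnlineCredentials from the client object model, which the fragment above already loads. A sketch that only demonstrates the authenticated connection, reusing $userName and $password from above; the admin URL is a placeholder:

# SharePointOnlineCredentials performs the claims handshake without a login popup.
$securePassword  = ConvertTo-SecureString $password -AsPlainText -Force
$spoCredential   = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($userName, $securePassword)

$ctx             = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant-admin.sharepoint.com")
$ctx.Credentials = $spoCredential

# Load something trivial to prove the credentials work before touching user profiles.
$web = $ctx.Web
$ctx.Load($web)
$ctx.ExecuteQuery()
Write-Host "Connected to" $web.Title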

SharePoint 2013 search broken after upgrade from Foundation 2013 to 2013 Enterprise


Hello

I have tried many of the suggestions I have found in the forums for this issue, but none have worked so far.

I have SharePoint 2013 SP1 running on Server 2012 R2, connected to a SQL 2012 database, upgraded from SharePoint Foundation 2013. Everything is working with the exception of search; the errors are below.


Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (1d094485-eab9-458d-a363-4b5e8704048f).

Reason: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))

Technical Support Details:

System.UnauthorizedAccessException: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))

   at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()

   at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)

For some reason this is using the farm account, which is not the account used for the search service but which has full access to the sites and databases, so I'm not sure what or where it is getting an access denied error. Is there any way to find out what it is trying to access? BTW, I have disabled the loopback check, verified the account has correct access to %windir%\Tasks, recreated the search service many times, and changed application pool accounts.

Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (1d094485-eab9-458d-a363-4b5e8704048f).

Reason: The object you are trying to create already exists. Try again using a different name. 

Technical Support Details:
System.Runtime.InteropServices.COMException (0x80040D02): The object you are trying to create already exists. Try again using a different name. 
   at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()
   at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)

Any help you can provide would be greatly appreciated.
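The "object you are trying to create already exists" from Synchronize() often points at a half-provisioned search service instance left over from the Foundation-era search rather than at the service application itself. Before recreating the SSA yet again, it may be worth re-provisioning just the local search service instance; a sketch, run on the server that logs the errors:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Confirm which identity the search service itself runs under.
Get-SPEnterpriseSearchService

# Re-provision the local search service instance instead of recreating the whole SSA.
$instance = Get-SPEnterpriseSearchServiceInstance -Local
Stop-SPEnterpriseSearchServiceInstance $instance
Start-Sleep -Seconds 60
Start-SPEnterpriseSearchServiceInstance $instance
Get-SPEnterpriseSearchServiceInstance -Local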

Single SharePoint 2013 Farm for two geographical locations


Dear All,

I want your expert advice and suggestions on an architecture for deploying SharePoint 2013 in multiple geographical locations. I have read every line of the article http://technet.microsoft.com/en-us/library/gg441255%28v=office.15%29.aspx before posting this question, for the following scenario:

Scenario:

  1. Two locations, say, Loc1 and Loc2.
  2. SharePoint Farm = Collection of SharePoint servers connected to the same Configuration DB.
  3. WFE1 will be in Loc1 and WFE2 will be in Loc2 (and connected to the same config DB i.e. same Farm).
  4. App1 will be in Loc1 and App2 will be in Loc2 (and connected to the same config DB i.e. same Farm).
  5. Users from Loc1 will be directed to WFE1 and Users from Loc2 will be directed to WFE2.
  6. App1 will run services and service applications consumed by web applications specific to Loc1.
  7. App2 will run services and service applications consumed by web applications specific to Loc2.

I can implement all of the above.

Now the question is: how do I provide a local flavor for the back-end SQL?

Suppose I configure "Active-Active" SQL clustering with SQLNode1 at Loc1 and SQLNode2 at Loc2. How do I specify that web applications meant for Loc1 are served from SQLNode1 and web applications meant for Loc2 are served from SQLNode2?

Users in Loc1 are not interested in web applications meant for Loc2, and vice versa. Currently we have two different farms, one at each location, but going forward we want to consolidate so that infrastructure maintenance is minimized and investment in third-party solutions can be leveraged by both locations.


Naveed.DG MCITP, MCTS -SharePoint 2010 Administrator "Vote As Helpful" If it helps!!
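Within a single farm, data locality is controlled at the content database level: each web application's content databases can be created on whichever SQL instance or alias is local to that location, while the single configuration database stays in one place (and remains the latency-sensitive piece the stretched-farm guidance in that article is really about). A sketch with placeholder web application URLs and database names:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Loc1 web application: content database created on the SQL instance local to Loc1.
New-SPContentDatabase -Name "WSS_Content_Loc1" `
                      -WebApplication "http://loc1.contoso.com" `
                      -DatabaseServer "SQLNode1"

# Loc2 web application: content database created on the SQL instance local to Loc2.
New-SPContentDatabase -Name "WSS_Content_Loc2" `
                      -WebApplication "http://loc2.contoso.com" `
                      -DatabaseServer "SQLNode2"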

The specified user could not be found - ADFS with multiple realms

I am using a single trusted identity token issuer (ADFS 2.0) with multiple realms for different sites (urn:sharepoint:int-site1 and urn:sharepoint:int-site2).  I added my provider to both sites through central administration and the first site works fine and allows my external user to authenticate.  The second site gives me an access denied page (which I expected) and asks that I request access.  When I submit the request for access I get an error message back stating "The specified user username@email.com could not be found".  What could I be missing?
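The access-request page has to resolve the typed name through the people picker, which can fail for a trusted identity provider even when sign-in itself works. One way to confirm it is only the lookup, rather than the trust, is to add the external user to the second site directly with the encoded claim; a sketch, with the e-mail, site URL and group name as placeholders, assuming a single ADFS issuer:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Build the encoded claim for the external user as issued by the trusted provider.
$issuer    = Get-SPTrustedIdentityTokenIssuer
$principal = New-SPClaimsPrincipal -Identity "username@email.com" `
                                   -TrustedIdentityTokenIssuer $issuer

# Add the claim directly to the second site's visitors group, bypassing the request page.
New-SPUser -UserAlias $principal.ToEncodedString() `
           -Web "http://int-site2" `
           -Group "Site2 Visitors"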

With SP1 + April 2014 CU, issue converting to claims


Hello,

I'm testing an upgrade to SharePoint 2013 on an environment running on 2012 R2, with the fixed SP1 and April 2014 Cumulative Update. My issue is that, in this farm, I cannot convert any of my classic databases to claims. I run into the following error for every single account that convert-spwebapplication tries to convert:

SPWebApplication '41cdd631-b97d-4d1e-ad46-69d5edf33564', SPContentDatabase '376ece3b-0e55-4020-91a5-cd8a1fc980b5', SPSite 'a8b2c432-e7d4-4684-915a-8e22b744ea32', SPUser '9': Could not get migration data for entity so SKIPPING. Check migrator for further logs. Entity Old Name: '[mydomain\myuser]', Old Key 'S-1-5-21-299502267-1532298954-682003330-416720', New Name: 'i:0#.w|[mydomain\myuser]', New Key ''

This behavior occurs whether I attach a classic mode database to either a classic mode or claims mode web app. I have confirmed that classic auth works fine when the DB is attached to a classic mode web app in SharePoint 2013. It's just that when I run convert-spwebapplication (or any of the various ways to trigger the user conversion to claims) I always get the above error and no one can log in on the claims web app.

I tested this process against a different farm I have running the April 2013 CU, and I do not have these issues there. That farm is also on Windows Server 2008 R2, so I'm not sure whether it's a server OS thing or something introduced in a later CU or in SP1.

Anyone run into this before?


Project Online Edit Schedule Permissions

I'm using SharePoint groups for Project Online, and for some reason when a user is in the Owners group (Full Control) of a project he can edit the schedule, but when I put him in a custom group, Managers (ALSO with Full Control), Edit Schedule is greyed out. Just for fun, I deleted the OOTB Owners group in one project; now ONLY site collection admins can edit the project schedule. Any help appreciated.

How to find correct server in KEMP environment?


Hello SharePoint Fam,

My current environment has 3 WFEs that sit behind a KEMP load balancer.  When a user receives an error message, nothing in the message tells you which server the error came from, which forces me to guess and search through the logs on each WFE.  Is there an easy IE plug-in or tool that would tell me exactly which server was being used at the time of the error?  It would be nice to know which server to go to; that would save a lot of time, since all I would have to do is hit that server and grab the correctly timestamped log file.

Thanks in advance,
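Two approaches usually cover this: add a custom response header on each WFE so the browser's F12 tools show which node answered, and use the error's correlation ID with Merge-SPLogFile so you don't have to guess which server's ULS log to open. A sketch of both; the IIS site name, header name and correlation GUID are placeholders:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Pull the ULS entries for one correlation ID from every server in the farm into a single file.
Merge-SPLogFile -Path "C:\Temp\error.log" `
                -Correlation "b66db71c-39f5-40aa-a269-2a13dd5d9162" `
                -Overwrite

# Optionally, on each WFE, add a header so the responding node is visible in the browser.
Import-Module WebAdministration
Add-WebConfigurationProperty -PSPath "IIS:\Sites\SharePoint - 80" `
    -Filter "system.webServer/httpProtocol/customHeaders" `
    -Name "." -Value @{ name = "X-SharePoint-WFE"; value = $env:COMPUTERNAME }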

Visio Services Error


When trying to load a Visio document in SharePoint 2013 anywhere, I get the following error dialog box:

Error    The server failed to process the request.  Ok Retry

When looking at the actual web front end, I see the following errors in the Event Log:

Event 8078, Visio Graphics Service

Failed to get raster diagram for visio file https://contoso.contoso.com/departments/foo/engineering/Architecture%20Diagrams/Sharepoint%20Topology.vsdx page default page ID Exception : System.ServiceModel.FaultException: The server was unable to process the request due to an internal error.  For more information about the error, either turn on IncludeExceptionDetailInFaults (either from ServiceBehaviorAttribute or from the <serviceDebug> configuration behavior) on the server in order to send the exception information back to the client, or turn on tracing as per the Microsoft .NET Framework SDK documentation and inspect the server trace logs.

Server stack trace:

   at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood(Message reply, MessageFault fault, String action, MessageVersion version, FaultConverter faultConverter)

   at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)

   at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)

   at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)

   at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

Exception rethrown at [0]:

   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)

   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)

   at Microsoft.Office.Visio.Server.GraphicsServer.IVisioGraphicsService.GetRasterPage(RasterPageRequest rasterPageRequestContract)

   at Microsoft.Office.Visio.Server.Administration.VisioGraphicsServiceApplicationProxy.GetRasterPage(RasterPageRequest request)

   at Microsoft.Office.Visio.Server.ServiceWrapper.GetRasterPage(RasterPageRequest request)

So far, I've tried making sure that application permissions are correct (they are), doing IIS resets, and a whole host of other things that I could find through Google/Bing.  Is there a quick and easy way to either correct the error and get Visio rendering again, or to just rebuild Visio Services entirely?
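Before rebuilding anything, it is worth confirming that the Visio Graphics Service instance is actually online somewhere in the farm and that the service application still has a proxy; a missing or stopped piece produces exactly this kind of WCF fault. A quick diagnostic sketch (if these all look healthy, deleting and recreating the service application with Remove-SPServiceApplication and New-SPVisioServiceApplication is usually the faster path than chasing the fault):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Is the Visio Graphics Service instance online on at least one application server?
Get-SPServiceInstance | Where-Object { $_.TypeName -like "Visio*" } |
    Select-Object Server, TypeName, Status

# Do the service application and its proxy exist and report Online?
Get-SPVisioServiceApplication
Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "Visio*" }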

SSL site + LoopBackCheck = Search Access Denied error in SharePoint 2010


Hi there.

I have an issue on my farm with the Search Service crawler.

On the farm, we have 4 "regular" web applications and 2 HTTPS sites.

In setting up the search Content Sources, I have created separate ones for each site.

When doing a full crawl, the two SSL sites get the generic "Access denied" message. (Access is denied. Verify that either the Default Content Access Account has access to this repository, or add a crawl rule to crawl this repository. If the repository being crawled is a SharePoint repository, verify that the account you are using has "Full Read" permissions on the SharePoint Web Application being crawled.)

Of course, I have verified that my "SP2010SearchAccess" account has "Full Read" rights on all the web applications.

I logged in to these SSL sites from another computer, so I know the account has "Full Read" access.

I've been able to tie the issue down to the LSA loopback check.  I know that one option would be to disable the loopback check altogether, but that is not allowed by our corporate security team.

As proof, if I browse the site using https://sitename from the SharePoint server, I get a 401 error after being prompted 3 times for my credentials (I used the same SP2010SearchAccess account); it doesn't let me in.  If I use the IP address (https://www.xxx.yyy.zzz), I get the certificate error, but it lets me in.

So I tried changing the content source to use the IP instead, and I changed the "Farm-Level Search Settings" option "Ignore SSL warnings" to "Yes", but it still doesn't work.

Does anybody have a suggestion, idea, fix?

Thanks!
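The usual middle ground when disabling the loopback check outright is off the table is the BackConnectionHostNames registry value, which whitelists only the specific host names so local authentication to those URLs succeeds while the loopback check stays on for everything else. A sketch to run on each server that crawls or serves the SSL sites (host names are placeholders), followed by an IIS restart:

# Add only the SSL site host names to the loopback-check exception list.
$key   = "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0"
$hosts = @("secure1.contoso.com", "secure2.contoso.com")

New-ItemProperty -Path $key -Name "BackConnectionHostNames" `
                 -PropertyType MultiString -Value $hosts -Force | Out-Null

# The change takes effect after restarting IIS (or rebooting the server).
iisreset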

Can't edit or open documents online with OWA 2013


Hi,

I have just built a test environment of SharePoint Server 2013 with Office Web Apps 2013. I followed all the necessary steps in this TechNet scenario http://technet.microsoft.com/en-us/library/ff431687(v=office.15)#scenario1 (created a binding from SharePoint to the OWA server, set the WOPI zone to internal-http, and enabled OAuth over HTTP).

I can successfully log into my SharePoint site, but I can't edit or view any Office documents. If I try to create/edit/view a document, I get the message "Server Error. We're sorry, an error has occurred. We've logged the error for the server administrator."

After changing the WOPI zone to external-http, I can only view Excel documents; I can't create them (I get an upload document prompt), and other document types can't be edited or viewed online (the document is downloaded to the local machine instead).

It would be great if someone could help me out with this problem.

Thank you in advance
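For an all-HTTP test rig like this, the WOPI zone and the OAuth-over-HTTP switch have to agree on both farms, and editing has to be enabled on the OWA side. A sketch of the settings that usually need rechecking, run on the SharePoint server and the Office Web Apps server respectively; the zone value matches the scenario above:

# --- On the SharePoint server ---
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPWOPIZone                       # should report internal-http for this scenario
Set-SPWOPIZone -Zone "internal-http"

# Allow OAuth over plain HTTP so SharePoint and OWA can exchange tokens without SSL.
$config = Get-SPSecurityTokenServiceConfig
$config.AllowOAuthOverHttp = $true
$config.Update()

# --- On the Office Web Apps server ---
Set-OfficeWebAppsFarm -AllowHttp:$true -EditingEnabled:$true
Get-OfficeWebAppsFarm                # confirm InternalUrl uses http:// and editing is enabled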
