Monday, November 19, 2012

One thing that can cause an internal AD account lockout...

I consolidated a domain recently and moved the domain controller to use our primary domain instead of its own distinct domain (long story why it was set up this way to begin with). Shortly after, my user account started getting locked out periodically. The problem was that something was trying to log in as olddomain\myaccount, but the olddomain domain didn't exist anymore. Since I have an account on newdomain named myaccount, the repeated invalid logins for that username were locking out my newdomain account instead.

I first figured it had to do with a service running under the old credentials, but ruled that out quickly. The Windows security logs were less than helpful, beyond letting me see the times of the login attempts. I then turned to Wireshark but didn't find anything useful there either. With no evidence of the attempts coming over the network, I concluded it had to be happening on the newly migrated DC itself. I enabled Netlogon logging, which confirmed that something on that DC was using my credentials, but what?
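In case you need to do the same, you can enable verbose Netlogon logging from an elevated command prompt on the DC (the log ends up in %windir%\debug\netlogon.log):

    nltest /dbflag:0x2080ffff

Reproduce the lockout, review the log, and then turn the logging back off with nltest /dbflag:0x0.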

Then I finally ran across this post. I had completely forgotten that dynamic DNS registration in DHCP requires you to set up login credentials. Sure enough, this was the cause of my problem. It's not best practice to set it up the way I had, but regardless, that's the way it was. I changed the dynamic registration credentials and, lo and behold, no more lockouts.
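If you hit the same thing, the dynamic DNS registration credentials can be changed either in the DHCP console or from a command prompt on the DHCP server; the account name, domain, and password below are placeholders for whatever you use:

    netsh dhcp server set dnscredentials svc-ddns newdomain.local P@ssw0rd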

Thursday, August 30, 2012

An openupload installation better explained

If you're not familiar with it, openupload is an open-source, web-based file-sharing system, kind of like RapidShare or MediaFire. It's written in PHP and runs fine on a LAMP server. There haven't been any new versions published since November 2010, but I'm still able to run it on Ubuntu 12.04.1 without any changes. It appears to be highly recommended on SourceForge, and so far it has worked well for my setup once I was able to get it actually running.

I had my fair share of problems getting openupload to run with my limited knowledge of PHP and Linux, but after roughly following this guide I was able to get it working and will be rolling it out as an FTP alternative at my workplace. The lack of setup documentation was one of the more frustrating problems I ran across, so I'm going to share what I've been able to put together in the hope of making your install easier.

First, to preface the documentation, here's a little bit of information about my setup. As I said, I'm running this on Ubuntu 12.04.1 server (64-bit), with PHP 5.3.10, MySQL 5.5.24, and Apache 2.2.22.

Following the guide mentioned above, I was able to get the basic installation working for the admin user. However, there were still issues with downloading, and with any non-admin user. The problem is in openupload's permissions system, which you will likely have to change to accommodate your setup. You can do this through the site administration (Administration->Rights, then select the group you want to change), or you can manually edit the MySQL tables, but I'd suggest using the administration tool. To handle it during setup instead, the INSTALL file describes the different permission modes available: pick the mode you want, go into the sql/txt/modes directory, find the corresponding permissions file, and copy that file (e.g. acl_restricted.txt) to your sql/txt/ directory, overwriting the acl.txt file. Otherwise you can modify the permissions after setup using either method mentioned above. The headings below are the names of the tables that control the permissions (there's an example query right after the first list). For the rest to make sense you may want to look at the tables in MySQL first, and then it should click:

acl

  • id - is the primary key and is set manually
  • module - refers to the modules of the site, with values of admin, auth, and files
    • admin refers to site administration privileges, which should only go to the admin group
    • auth covers login and account-related actions (see the auth actions below)
    • files refers to the uploading, downloading, or deleting of files
  • action - sub-categories of the modules to be more specific about the permissions
    • admin actions
      • not sure exactly; the admin group uses the wildcard * to allow everything
    • auth actions
      • login - whether the group can login or not, which should always be allowed
      • logout - whether the user can click logout, which should be allowed
      • profile - whether the user can view and change his or her profile
      • register - whether a user can register themselves or not
    • files actions
      • d - whether the group can download files
      • u - whether the group can upload files
      • r - whether the group can remove files
      • l - whether the group can view their files. This enables the My Files tab
  • group_name - which group the acl pertains to
  • access - either allow or deny
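If you'd rather fix a permission straight from MySQL, a query along these lines should do it (the group name here is just an example; adjust it for your groups). This one would let a "users" group download files; flip access to 'deny' to take that away:

    UPDATE acl SET access = 'allow'
    WHERE group_name = 'users' AND module = 'files' AND action = 'd';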
The plugins work the same way. You can modify them in the administration settings (Administration->Plugins->Plugins ACL), or directly in the MySQL table:
plugin_acl

  • id - is the primary key and is set manually
  • group_name - specifies which group the acl pertains to
  • plugin - specifies which of the optionally installed plugins the acl pertains to
    • password - lets the group require a password before a file can be downloaded
    • captcha - lets the group require a captcha entry before a file can be downloaded
    • email - lets the group have notification emails sent for uploaded files, removal links, etc.
    • mimetypes - allows restricting uploads by MIME type, per group
    • compress - another one I'm not sure about; I'm guessing it enables some type of transparent compression
    • expire - enables auto-expiration of links after 30 days, and possibly automatic file deletion as well
If you know of other permission settings I'm missing, please add them in the comments. These are the ones I've figured out or found, and they were enough to get everything working as I need it to.

If you're using plugins, you may also need to change the related options. You can do that in the administration under Administration->Plugins->Plugins Options.

Now, you may also want to modify or rebrand the site to better fit the look you want. This is mentioned briefly in the documentation, which suggests creating a new site template and changing the configuration file to point to that directory rather than editing the default template, so it's up to you. Here's where the relevant files are located, relative to your openupload install directory:


  • Openupload config file - contains all the main application settings (db information, site template and directory mappings, etc)
    • [openupload directory]/www/config.inc.php
  • Main page html file - in case you want to add links or anything to the main page
    • [openupload directory]/templates/default/index.tpl
  • Main CSS file - for modifying the background, alignment of objects, colors, etc
    • [openupload directory]/www/templates/default/main.css
  • Logo image - the logo that displays in the upper-left corner
    • [openupload directory]/www/templates/default/img/openupload.jpg
  • Email template - this is the default template used when sending notification emails
    • [openupload directory]/templates/default/plugins/email/notify.tpl
The [openupload directory]/templates/default/ directory contains the HTML pages for essentially the entire site.

If you run into problems uploading a file, make sure to check the upload size limits. These are set in both the openupload config file and your php.ini file. You'll want to make sure that the HTTP POST limit (post_max_size) and the max upload size (upload_max_filesize) in your PHP settings match, and that [openupload directory]/www/config.inc.php matches that value as well. By default, the PHP settings allow only 2MB files.
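On Ubuntu 12.04 with mod_php those directives live in /etc/php5/apache2/php.ini (adjust the path for other distros). To allow, say, 100MB uploads (pick whatever limit you need), set both values and restart Apache, then set the matching limit in the openupload config file:

    upload_max_filesize = 100M
    post_max_size = 100M

    sudo service apache2 restart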

I hope that helps you get started with openupload. There's another one out there called FileZ, but I haven't tried it and only mention it in case you want to look into an alternative with more recent development.

Monday, June 25, 2012

Adobe Creative Suite 3 (CS3) auto update doesn't work

If you try updating your Adobe CS3 product and receive a message saying updates cannot be detected at this time, it's because of an expired security certificate. The cert comes with the CS3 install, and I don't believe Adobe issued a new one. To get around this problem, simply set your system time back to any date prior to 10/16/2011. This tricks the system into thinking the certificate hasn't expired yet, so the connection will be allowed. Make sure to set your system time back to normal after getting your updates installed, though.
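From an elevated command prompt the whole dance looks something like this, assuming a US-style date format (w32tm /resync will snap the clock back afterwards if the machine syncs time from a domain controller or internet time source; otherwise just set the date forward again by hand):

    date 10/15/2011
    (run the CS3 updater)
    w32tm /resync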

http://forums.adobe.com/message/4096973

Wednesday, June 6, 2012

SQL Server SSIS using Excel fails with DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER

If you're running 64-bit SQL Server, your packages get executed with the 64-bit SSIS runtime by default. Unfortunately, the Excel Connection Manager relies on a provider that only exists in 32-bit, so packages that use it fail with this error. Luckily, you can still use BIDS to create your SSIS package, and getting the package to run is as simple as checking a box.

If you're running SQL Server 2008 and using a SQL Server Agent job to execute your SSIS package that contains your Excel connection, here's how to get it working:

1. Go into the Job Properties
2. Select the step containing your SSIS package and click Edit
3. In the step properties, go to the Execution options tab and check the box labeled "Use 32 bit runtime" (see screenshot)


This forces the SQL Server Agent to use the 32-bit runtime when executing the package, which does support the Excel Connection Manager.

If you're using SQL 2005 or DTEXEC to execute your package, check out the original post I found that helped me.
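For what it's worth, if you're running packages by hand you can also just call the 32-bit copy of DTExec directly; on a default 64-bit SQL 2008 install it lives under Program Files (x86), and the package path below is only an example:

    "C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe" /FILE "C:\SSIS\MyExcelPackage.dtsx"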

Thursday, May 17, 2012

Replace LCD Screen or other parts on an HP 2000 notebook

I was going to take pictures and document the process of replacing the LCD screen on an HP 2000-216NR notebook, but I found HP's service manual online and it does a good job of the same thing. You can find it here: http://h10032.www1.hp.com/ctg/Manual/c02753308.pdf. It's the full service manual, so it has the steps to take apart and replace just about every component you'd want to. If you're specifically replacing the LCD panel rather than the entire display assembly, you may want to check out this article too: http://www.insidemylaptop.com/replace-broken-screen-on-hp-2000-laptop/. I used a combination of both to successfully replace the LCD.

Wednesday, May 2, 2012

ApplicationException: Access is denied when using EPPlus or OfficeOpenXml

I have a basic site set up with reports that can be exported to Excel 2007 .xlsx files. I was running into an issue with one of the larger reports where it would work for a while and then crash with an "ApplicationException: Access is denied" error. Running the same procedure on my test box worked fine, so I had to scratch my head a little. Luckily, Google searching came through again and the fix is very simple. I found it here: http://excelpackage.codeplex.com/workitem/17586

On your web server, create a folder named IsolatedStorage in C:\Documents and Settings\Default User\Local Settings\Application Data\. Then give the identity configured on your application pool read and write privileges on that folder. If you don't know which account that is and are running IIS 6, go to Application Pools in IIS Manager, right-click the pool you want to check, and go to Properties; the account name is under the Identity tab.
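If you'd rather script it, something like this should work from a command prompt on the web server. It assumes the pool identity is Network Service (the IIS 6 default); substitute whatever account the Identity tab shows:

    mkdir "C:\Documents and Settings\Default User\Local Settings\Application Data\IsolatedStorage"
    cacls "C:\Documents and Settings\Default User\Local Settings\Application Data\IsolatedStorage" /E /G "NETWORK SERVICE":C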

Wednesday, April 4, 2012

Updating BIOS or firmware on Dell server running VMware

I tried using the OMSA LiveCD from Dell, but it locked me out of the network preferences and wouldn't pull DHCP, so I had to come up with my own workaround. Dell also fails to mention that a package needs to be installed for the updates to run, so I decided to put together a quick walkthrough for anyone else with the same problem, and for me to refer back to the next time I need to remember which package it is.

First, you'll need a bootable live disk of an OS. If you want to make a Windows version for yourself, feel free, and I'm going to imagine you wouldn't be reading this if you already knew how to do that. Otherwise, I've used CentOS. You can check it out at www.centos.org, and visit their downloads section to find a mirror to download an .iso file to burn to disc. I'm currently using v6, and it has worked well on the machines I've needed to update. If you're not familiar with Linux, it's ok, I'm not that great with it myself. There's nothing too technical to these updates though.

If you want to download the updates directly on the server, reboot the server and boot from the CD to load CentOS. If you have to download the updates onto a USB flash drive instead, make sure you plug the flash drive into the server before booting from your LiveCD. If you're downloading directly, the system should pull an IP address via DHCP. If you don't have DHCP configured on the network the server is connected to, go to System->Preferences->Network Connections, select the Ethernet card you want to configure, and change the IP settings to what you need them to be.

Now you can open up Firefox to get to the Dell website to download the updates. Make sure you download the .BIN files that are for updating in Red Hat. You may have to use the "Other download formats" link on each to get to the .BIN option, but they're there. Skip to the next step if you're using a flash drive.

Open the Terminal (Applications->System Tools->Terminal) and navigate to where your files are with cd /path/to/files/. If you have a USB device plugged in, you should be able to open it on the desktop to see the mount path (probably somewhere under /media).

Once you're in the folder containing the updates, run sudo chmod +x *. This makes all the .BIN files executable. After you've done that, you need to make sure one of the dependencies is installed, which means you'll need a network connection unless you can load the package onto your USB device and install it from there as well. To install the required dependency over the network, use sudo yum install compat-libstdc++-33* and follow the prompts to install the latest version of that package on the machine.

Once you have the dependency installed, you can start installing the updates. To do that, use the command sudo ./updateFileName and press Enter. If you want to verify the update first, add a -c at the end. The regular update runs the same check anyway and gives you a chance to cancel, so the -c is a little overkill in my opinion. Follow the prompts to get your updates installed, then reboot and get your VMs back up and running.
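Put together, a typical session looks something like this (the mount point and update file name are placeholders; use your own):

    cd /media/USBDRIVE
    sudo chmod +x *
    sudo yum install compat-libstdc++-33*
    sudo ./UpdateFileName.BIN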



Monday, April 2, 2012

Allow standard users to update UPS WorldShip

I have a user who uses UPS WorldShip 2012 daily. I didn't want to give the user local admin rights on the computer, but WorldShip updates itself automatically, more than once a week, and those updates require admin privileges to run. So I did some digging to come up with a workaround. It turns out it's possible, and pretty easy to set up.

First, run RegAccess.exe in the UPS install folder (the standard path is C:\UPS\WSTD\RegAccess.exe) using an account with admin rights on the machine. It only took a few seconds to run.

Next, you'll need to download the Microsoft Application Compatibility Toolkit, which you can get from here. You want the ApplicationCompatibilityToolkitSetup.exe file. Install that on the machine running WorldShip, and then start the Compatibility Administrator (32-bit) program with an admin account.

From here you'll want to check out this article for instructions on how to use the toolkit. For the name of the program to update, use runpatch.exe. The vendor is UPS (or whatever you'd like; it doesn't really matter). For the program file location, find the runpatch.exe file in the UPS folder (the default is C:\UPS\WSTD\runpatch.exe).

After you run through the setup and check the RunAsInvoker option, choose Save As and it will ask you for a database name. This becomes the name of the .sdb file you need to import; the example in the linked article used uac-whitelist. I saved my .sdb file in the C:\UPS\WSTD folder to keep everything together, but you can save it wherever you'd like.

Once you have your .sdb created, open a command prompt with admin rights, enter sdbinst C:\UPS\WSTD\uac-whitelist.sdb, and press Enter. If you saved the file somewhere else or under another name, adjust the path accordingly. That imports the .sdb into the system, so updating UPS WorldShip no longer requires admin privileges.

Tuesday, March 27, 2012

OS X mobile user is not recognized as an admin

I've run into this problem more and more as we move our Mac users to laptops. I have an Active Directory security group set up for them, and I assign that group admin rights in Directory Utility when binding the Mac to the domain. However, if the Mac can't reach a domain controller and falls back to the cached mobile credentials, the user doesn't have admin privileges until the next time they log in while connected to the network.

You'd think this would be a simple thing for Apple to include, but at least there is an easy workaround. To get around this:

1. Log in as the domain user if you haven't already. This will create a mobile profile for them on the machine
2. Shut down
3. Disconnect from the network
4. Boot up and log in using a local admin account
5. Open System Preferences->Users & Groups, and unlock it if necessary
6. Click on the domain user from step 1 and check the box marked "Allow user to administer this computer"
7. Close and reboot

If you want to test this, leave the network disconnected and login with the domain user account. You should now have admin privileges on the machine in your mobile account. I don't know why I didn't think of this sooner, but thanks to this article for the resolution.
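If you'd rather handle step 6 from the Terminal while logged in as the local admin, adding the mobile account to the local admin group should accomplish the same thing (replace username with the account's short name):

    sudo dscl . -append /Groups/admin GroupMembership username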