
Harvest R7 – R12: Performing an Upgrade Process

A checklist of activities to be aware of when performing the upgrade process

This article will focus on the upgrade process from AllFusion Harvest Change Manager R7.1 to CA Software Change Manager (SCM) R12.0.2. I am writing this article because I recently went through this process and felt it would be beneficial to share this experience with other users in the field. We ran into some surprises and I wanted other users to be able to use this article as a checklist of activities to be aware of when performing this upgrade process.

This process can be pretty straightforward if the upgrade only concerns the repository data alone. However, there is a lot of overhead if you are using Forms, Attachments to Forms, User Defined Processes (UDPs), Email notifications, Encryption of passwords, Enforce Package Bind Flags, Verify Package Dependency Flags, Canned Reports or Customized reports. The project that I worked on used many of these features, which made the upgrade process more labor-intensive as a result. The team behind this upgrade used Harvest very aggressively and took full advantage of the power and robustness of the tool, which in turn enriched the project information available to the team in a central repository.

Presently, I am the administrator of CA SCM AllFusion Harvest R12, a process-based Software Configuration Management (SCM) tool for managing application source data assets. I manage the 198 applications housed in Harvest and support the 163 users of the product. The development tools we currently use in our development environment are PowerBuilder PBV8 and PBV11; Visual Studio 2003, 2005 and 2008; Visual Basic V6.0; IBM RAD V7.5.5; and Eclipse Europa.

As the Software Configuration Manager, I provide the administration of the source code management tool. This includes creating Harvest environments; developing life cycles, environment phases, processes, users, user groups, and levels of access to Harvest environments; loading repositories for archival purposes; documenting software methodologies; maintaining build machines; providing best practices; and training all users on proper source code management with the development tools in our environment.

Every Software Configuration Management tool is different in terms of functionality and navigation; however, they all share common threads of functionality. Common activities include checking out, checking in, adding new files, deleting existing files, obsoleting files and loading an initial baseline of source code. How these tasks are achieved differs from tool to tool, but most SCM tools perform these basic activities. I prefer SCM tools that have a relational database behind them for security, disaster recovery, retrieval and storage capability.

One of the first big differences I noticed was that the form is now stored as a table in the Oracle database in SCM 12, whereas in Harvest 7.1 it was stored on a shared drive or some other location where users could set the path and point to it from within the Harvest Workbench. Figures 1 and 2 illustrate how you set the path to the form's location in Harvest 7.1 so users can point to it to view the form. In SCM 12 the .hfd file is used: formgen.exe is run against the .hfd to generate .sql, .htm and .xml files, and then the hformsync command is run against the .hfd, uploading the .xml content into a table in the SCM 12 database.
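The two-step pipeline can be sketched as a dry run. The formgen invocation style below is an assumption (only the hformsync flags appear later in this article), and RUN=echo prints the commands instead of executing them, since the CA utilities require a Harvest install:

```shell
#!/bin/sh
# Dry-run sketch of the SCM 12 form pipeline: formgen generates the
# .sql/.htm/.xml files from the .hfd, then hformsync uploads the XML
# content into the SCM 12 database table. The formgen flags are an
# assumption; RUN=echo prints commands rather than executing them.
RUN=${RUN:-echo}
FORM="Test Form"

$RUN formgen "$FORM.hfd"      # emits $FORM.sql, $FORM.htm, $FORM.xml
$RUN hformsync -b "server name" -usr "xxxxxx" -pw "xxxxxx" \
    -d "e:/forms" -hfd -f "$FORM.hfd"
# forward slashes keep the dry run portable; the article uses e:\forms
```

Dropping RUN= (so the variable expands to nothing) would execute the commands for real on a machine with the CA client installed.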

In this case we are using Oracle 11g. This is a nice change from AllFusion Harvest Change Manager R7.1. Storing the form in a database table safeguards it from changes on a shared drive and makes retrieval and storage of its data more efficient. There is no longer any need for a shared drive location populated with form files for users to point to. It also prevents changes to the .htm or .hfd files from circumventing the process that keeps the data synchronized in the database. In the past, with the form files located where anyone could reach them, there was a temptation to change these files on the fly to reflect certain pieces of data without that data getting into all the form files, which could cause sync issues later on when upgrading the product. This new process takes away that temptation and stores the data in the database.

When we did this upgrade, a new server was purchased to house a new installation of SCM R12. We installed the SCM R12 server software, the SCM R12 client and the Oracle 11g client on this server and pointed to a Unix platform. Once we created our ODBC connection and created the new schema, we used a database export from our AllFusion Harvest Change Manager 7.1 server to import the data into the new SCM R12 database.

Before we ran the Hdbsetup.exe command, I ran the Hsysreport and Hdbanalyze commands, which provide information about the database and ensure that no errors exist before running Hdbsetup.exe. You want to make sure that you are importing error-free data into an error-free database (garbage in, garbage out). Once this was complete we ran CA's Hdbsetup.exe to upgrade the instance and data from Harvest 7.1 to SCM R12 using the commands Upgrade SCM Repository (UR), Load Projects (LP) and Load Forms (LF). These commands upgrade all repository, project and form data as of the date of the export you use.
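The check-then-upgrade sequence above can be sketched as a dry run; RUN=echo prints the commands rather than executing them, since the utilities themselves need a Harvest server:

```shell
#!/bin/sh
# Pre-upgrade sequence: report on and analyze the database first, and
# only run hdbsetup once both come back error-free. RUN=echo makes
# this a dry run; drop it to execute for real on a Harvest server.
RUN=${RUN:-echo}

$RUN hsysreport     # report on the current database state
$RUN hdbanalyze     # confirm the database is error-free
# hdbsetup then performs the upgrade; choose Upgrade SCM Repository
# (UR), Load Projects (LP) and Load Forms (LF) from it.
$RUN hdbsetup
```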

Once these actions were complete, I then ran the Formgen.exe against the latest .hfd file, which created the latest .sql, .htm and .xml files. I then created a directory at the root of the data drive on the server simply called "Forms" and I copied all four of the form files to this location. I had to do this because I had a difficult time navigating to the SCM home directory. This is a Windows 2008 server and there were some unique issues related to this operating system that had to be overcome. Once the forms (Test Form.hfd, Test Form.htm, Test Form.sql and Test Form.xml) were populated to the new directory at the root of the data drive, I then ran CA's Hformsync command in that directory. Below is the command that I used to sync up the form and the data to the database:

Example:  hformsync -b "server name" -usr "xxxxxx" -pw "xxxxxx" -d "e:\forms" -hfd -f "name of form.hfd"

When this runs, the following output message is generated, confirming whether hformsync completed successfully or had any failure(s) associated with it.

Example:  I00060040: New connection with Broker "xxxxxxxxxx"  established.
Problem Report form : processed successfully.
Number of Files Updated in DB:1
Form synchronization has been completed successfully.
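In an automated upgrade script, the captured output can be scanned for that final success message. The pattern below is taken from this sample output, so treat its exact wording as an assumption to verify against your own release:

```shell
#!/bin/sh
# Scan a captured hformsync log for the success message shown above.
# The pattern follows this article's sample output and may differ
# between releases - verify it against your environment.
check_formsync() {
    grep -q "Form synchronization has been completed successfully" "$1"
}

# Demonstration against a stand-in log file:
cat > hformsync.log <<'EOF'
I00060040: New connection with Broker "xxxxxxxxxx" established.
Problem Report form : processed successfully.
Number of Files Updated in DB:1
Form synchronization has been completed successfully.
EOF

check_formsync hformsync.log && echo "form sync OK"
```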

The form in SCM 12 has new navigational processes that differ from Harvest 7.1. For example, the attachment is now added directly under the form in the tree view, rather than via the small paper clip at the bottom of the form as in Harvest 7.1; now you right-click on the form to add a file to it. There is now a details report of the form that is very useful and can be printed. You can also open multiple forms at once from the Find Form view by clicking Edit Form; close the Find Form view and the forms are all available to view, and clicking the (X) button at the top of each form closes it after viewing. Figure 4 illustrates the paper clip at the bottom of the form and Figure 5 illustrates the right-click commands that are now available on the form in SCM R12.

Figure 6 illustrates the form search. Once the forms have been located, you can use the (Ctrl) key to select the form(s) that you want to view or edit, right-click and select (Edit Form), and the forms will individually be populated in the right-hand pane of the SCM Workbench for viewing. You will have to close the Find Form search screen to view the forms, which sit behind it. Figure 7 illustrates, at the top of the menu bar, the forms that are open for viewing. When you have finished reviewing one form, you can click the (X) to close it and begin viewing the next.

When we perform the database export it takes approximately one hour to complete. We shut down the Harvest brokers while this is being done and typically run the process in the evening, so that no one is accessing the database while the export is being conducted and users are not impacted during the day. We have found that running an export during the day slows our network performance greatly and impacts users' ability to perform Harvest SCM activities. When a full export is available, it's then sneaker-netted to the location where we want to import it into the SCM 12 database instance.

Once the commands have all been run and the data has been successfully updated, it's time to test to ensure that the data that came over as part of this upgrade process is accurate and that all processes work as expected:

  1. All users can log into the SCM R12 Workbench
  2. Repository data is accessible and accurate
  3. Form data is available and accurate
  4. Form attachments are accurate and can be viewed
  5. User Defined Processes (UDPs) are available and accurate
  6. Package Binding and Dependency flags are in place and functional
  7. Promotions and demotions are working in all Harvest states
  8. Test checkouts and check-ins confirm data can still be processed

Once the verification testing is complete and you're confident of the accuracy of all other processes and data assets, it's time to update your build script, perform a build, and produce a solid executable. By solid executable I mean a verifiable build that reflects production in the field. The source data assets are accessed via the Harvest hco command and data is acquired from the new Harvest server repositories.

When we began updating and running our Ant script, we were unaware that encryption was being used during the build process. We discovered this when the Ant script attempted a browse (read-only) checkout of source data assets from the new SCM R12 Harvest server, environment, state and repository, and produced the following error message:

The error is produced when using the -eh option with the hco command in the Ant build script.

HCO command we use:  hco -b "server name" -usr "xxxxxxxx" -pw "xxxxxxxx" -vp \Test_Code_Project -en Test_Code_Project -st "ACCEPTANCE TEST" -pn "CHECK OUT FOR BROWSE/SYNCHRONIZE" -cp c:\Test_Project\ -br -r -op pc -s "*.*" -o output.log

ERROR: Please encrypt the password file..\lib\xxxxxx.txt with new svrenc utility

Refer to page 161 of the CA Software Change Manager Command Line Reference Guide. The error above refers to the new svrenc utility. When this error is received, you need to run the command below to create a new encrypted user/password file. The name of the new encrypted file needs to be xxxxxxxx.txt.

svrenc Command - Encrypt User and Password Credentials to a File

The svrenc command is a user name and password encryption utility that stores encrypted credentials in a file, which can then be used by the:

  1. CA SCM Server to obtain the database user and password
  2. CA SCM Remote Agent (LDAP support)
  3. CA SCM command line utilities to obtain the CA SCM or remote computer user and password

This command has the following format:

svrenc {-s | -f filename} [-usr username] [-pw password] [-dir directory_name] [-o filename | -oa filename] [-arg] [-wts] [-h]

-s   (Required; -s and -f are mutually exclusive and one is required.) Specifies that the encrypted credentials are saved in a hidden file named hsvr.dfo in <CA_SCM_HOME>, which is then utilized by the CA SCM Server processes when connecting to the database server.

-f filename   (Required; -s and -f are mutually exclusive and one is required.) Specifies that an encryption file be created with the file name you provide.

If -f filename is specified, the encrypted credentials are saved in a hidden file with that name, which can then be utilized by Remote Agent LDAP support or command-line utilities.

Once we ran the following command, the newly encrypted file was saved on our build machine and the Ant script built successfully as expected.

Example:  svrenc -f password.txt -usr xxxxxxx -pw xxxxxxxx -dir C:\Build\scripts

This encrypted file did not exist on our new build machine and had to be re-created using the command above. Running it created the new encrypted file containing the encrypted user name and password needed to run the Ant build script.
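To avoid hitting this error again on a fresh build machine, a build wrapper can check for the encrypted file up front. This is a sketch: the credentials path follows the svrenc example above, the ant invocation is an assumption about the build script, and RUN=echo keeps it a dry run:

```shell
#!/bin/sh
# Refuse to launch the Ant build if the svrenc-encrypted credentials
# file is missing, instead of failing mid-build as described above.
# The path and ant invocation are illustrative; RUN=echo is a dry run.
RUN=${RUN:-echo}
CRED=${CRED:-C:/Build/scripts/password.txt}

run_build() {
    if [ ! -f "$CRED" ]; then
        echo "Missing $CRED - recreate it with svrenc first" >&2
        return 1
    fi
    $RUN ant -f build.xml
}

run_build || echo "build skipped until credentials are recreated"
```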

When we perform back-ups or database exports on the Harvest server, we shut down all the services. This includes the Harvest brokers, the Oracle database and Apache Tomcat. We developed automated scripts that run on a schedule every evening and perform the shutdown of the services; when the backups and database export are complete, the automated scripts bring all the services back up as well. We updated these automated scripts for the new Harvest server and tested them to ensure that the backups and exports were running successfully.
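The nightly stop/export/restart cycle can be sketched as below. The Windows service names and the Oracle Data Pump parameters are placeholder assumptions (they do not appear in the article), and RUN=echo keeps it a dry run; note the database itself must still be up while a Data Pump export runs, so any file-level backup with Oracle stopped is a separate step:

```shell
#!/bin/sh
# Dry-run sketch of the nightly stop / export / restart cycle
# described above. Service names and expdp parameters are placeholder
# assumptions; substitute your own.
RUN=${RUN:-echo}

$RUN net stop "CA SCM Broker"     # keep users out during the export
$RUN net stop "Apache Tomcat"

$RUN expdp system/xxxx full=y dumpfile=harvest_full.dmp logfile=exp.log
# (file-level backups taken with the Oracle service stopped go here)

$RUN net start "Apache Tomcat"
$RUN net start "CA SCM Broker"
```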

The export that we used to import the projects from the Harvest 7.1 server contains Harvest environments that are not required on the new SCM R12 server. These projects take up resources (RAM and hard drive space) that are unnecessary on this server. CA provided us with the hsysdelete command, which allows the deletion of project/repository data from the Harvest SCM R12 database. Below is the command, provided by CA, used to accomplish the deletion of unwanted Harvest environments:

hsysdelete -n "xxxx" -u "xxxxxx" -p XXXXXXXXX "Repository Name"
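When several environments need to go, the same command can be looped over a list. The repository names below are placeholders, and RUN=echo keeps this a dry run:

```shell
#!/bin/sh
# Delete a list of unneeded Harvest environments with hsysdelete, as
# in the CA-provided command above. Repository names are placeholders;
# RUN=echo prints the commands instead of executing them.
RUN=${RUN:-echo}

for REPO in "Old_Env_A" "Old_Env_B"; do
    $RUN hsysdelete -n "xxxx" -u "xxxxxx" -p XXXXXXXXX "$REPO"
done
```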

One of the last tasks of this upgrade was to install the Business Objects application to make the 44 canned reports that come with SCM R12 available. Managers and users will find these reports very useful for reporting on SCM activities by user(s) or projects.

Our Experience
We develop and maintain 198 applications at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies; examples include applications for child-care licensing and for managing adult and elderly care. Throughout the state the applications are used by hundreds of users.

My synopsis and review of the SCM R12 upgrade process goes as follows:

As I stated earlier, this process is very straightforward if all that is involved is updating repository and project data. However, there is a lot of overhead if you are using Forms, Attachments to Forms, User Defined Processes (UDPs), Email notifications, Encryption of passwords, Enforce Package Bind Flags, Verify Package Dependency Flags, Canned Reports or Customized reports. The project that I worked on used many of these features, which made the upgrade more labor-intensive as a result. The team behind this upgrade used Harvest very aggressively and took full advantage of the power and robustness of the tool, which in turn enriched the project information available to the team in a central repository. There was, however, a lot more to look at and attend to in order to make this upgrade go off successfully.
The more moving parts, the more possibility for issues to arise. The good thing is that CA has a great technical team that is very helpful. When issues arise, they have the resources to tackle them and help you succeed with your implementation.

Please feel free to contact me should you have any questions regarding the product (CA SCM AllFusion Harvest) and its use in our environment with various development tools.

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, keyboard; he sings lead and back up vocals and he has released 8 CDs.


