BCS Socialize Links In Link Repository

https://bcsjava.com/fws/soln

The paradigm for URLs is different from its predecessor, HTML. None of the “normal” approaches apply anymore.

The two main components of a URL entry are the name of the link and the link itself. A third component, category, was added to give the links a hierarchy. This information is stored in an RDBMS, which is a handy way to manage URLs.

Unique categories are sorted and appear in a list for the user. When the user clicks a category, another list appears containing the links for that category, sorted by link name. When a link name is double-clicked, a page is launched with all of the social link engine entries. Traverse the list until the desired service comes into view and then double-click it.
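
A minimal sketch of how the category list could be filled from the RDBMS. The component names (qryLinks, lstCategories), the form class and the table layout links(category, linkname, url) are assumptions for illustration, not the shipped code:

  // Assumed: a TADOQuery named qryLinks wired to the link repository,
  // a TListBox named lstCategories, and a table links(category, linkname, url).
  procedure TLinkForm.LoadCategories;
  begin
    qryLinks.SQL.Text :=
      'SELECT DISTINCT category FROM links ORDER BY category';
    qryLinks.Open;
    lstCategories.Items.Clear;
    while not qryLinks.Eof do
    begin
      lstCategories.Items.Add(qryLinks.FieldByName('category').AsString);
      qryLinks.Next;
    end;
    qryLinks.Close;
  end;

A similar query filtered on the chosen category and ordered by link name would fill the second list.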

Mr. Arch Brooks, Software Engineer

BCS Link Capture Utility Update 001

https://bcsjava.com/fws/doc/linkstodb

The BCS Link Capture Utility has been updated.

The component documentation was replaced.

The source code was modified to include functionality for the edit key. Functionality was also included to prefix entered addresses with the default http://. The prefixes accommodated are http://, https:// and ftp://.
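
A sketch of how the default prefix logic might look in Delphi; the function name and exact handling are assumptions rather than the utility's actual code:

  // uses SysUtils for Trim and LowerCase.
  // Prepend the default http:// when the entered address carries no
  // recognized scheme; http://, https:// and ftp:// are left untouched.
  function NormalizeAddress(const S: string): string;
  begin
    Result := Trim(S);
    if (Pos('http://', LowerCase(Result)) <> 1) and
       (Pos('https://', LowerCase(Result)) <> 1) and
       (Pos('ftp://', LowerCase(Result)) <> 1) then
      Result := 'http://' + Result;
  end;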

The desired effect was to provide enough functionality to capture the desired information. Of course, a complete web browser built around the TWebBrowser component would be an extensive project unto itself.

Delphi 7 ADO Code Generation Techniques

Authoring a code generator for Delphi 7 that employs ADO typically starts with a file naming scheme. The main line dialog carries the family id suffix ml. The data module carries the family id suffix dmU. The component installer carries the family id suffix pk. A component driver program carries the family id suffix dp. The component tester carries the family id suffix cmpml. Finally, any workbench units carry the family id suffix wbU. Typically a batch file copies the associated forms and Delphi compiled units (dcu) to a central repository. This approach accommodates an unlimited number of components while keeping them available for seamless integration into multiple projects. A single entry in the IDE environment makes every component available for use and reuse by any number of projects.
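
As an illustration only, the family id BcsCust and the helper below are hypothetical; unit names under this scheme could be composed like this:

  // Compose a unit name from a family id prefix and one of the fixed suffixes,
  // e.g. 'BcsCust' + 'dmU' -> 'BcsCustdmU.pas'.
  function UnitName(const FamilyId, Suffix: string): string;
  begin
    Result := FamilyId + Suffix + '.pas';
  end;

  // UnitName('BcsCust', 'ml')    -> main line dialog
  // UnitName('BcsCust', 'dmU')   -> data module
  // UnitName('BcsCust', 'pk')    -> component installer
  // UnitName('BcsCust', 'dp')    -> component driver
  // UnitName('BcsCust', 'cmpml') -> component tester
  // UnitName('BcsCust', 'wbU')   -> workbench unit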

A set of skeleton programs comprising the data grid, memo placeholder and reporting capability would be coded and used for subsequent code generation. This approach allows for uniformity of the source code and its associated documentation.

The family id prefix identifies the overall ID of the targeted source code. Possible naming conventions could combine a system id and table name, as well as other identifying schemes. The overall desired effect is to allow the technician to know at a glance, just from its name, which component they are using.

BCS Link Capture Utility

https://bcsjava.com/fws/doc/linkstodb

This project captures the URL and title of web pages via an embedded web browser.

The BCS Link Capture Utility provides all the necessary communications to
allow the user to surf the web using a built-in web browser. Upon
successful navigation to the desired web page, the URL and the page title
are captured and stored in a MySQL table.
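
A sketch of the store step, assuming an ADO connection to MySQL, a query component named qryInsert, and a table links(url, title); all of these names are illustrative only:

  // Insert the captured URL and page title; parameters avoid quoting issues.
  procedure TCaptureForm.StoreLink(const AUrl, ATitle: string);
  begin
    qryInsert.SQL.Text :=
      'INSERT INTO links (url, title) VALUES (:url, :title)';
    qryInsert.Parameters.ParamByName('url').Value := AUrl;
    qryInsert.Parameters.ParamByName('title').Value := ATitle;
    qryInsert.ExecSQL;
  end;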

This utility was created to capture and store projects on SourceForge. The
title is modified to remove the text after the pipe symbol, and the pipe
symbol itself is also removed. For pages whose titles do not include a
pipe, the application should be modified to relax the pipe adjustment.
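
The pipe adjustment could look roughly like this in Delphi; the function name is assumed:

  // uses SysUtils for Trim.
  // Keep only the text before the first pipe; the pipe itself is dropped.
  // Titles without a pipe are returned unchanged.
  function TrimTitle(const Title: string): string;
  var
    P: Integer;
  begin
    P := Pos('|', Title);
    if P > 0 then
      Result := Trim(Copy(Title, 1, P - 1))
    else
      Result := Title;
  end;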

Use the data grid to delete any captured records that should not be
included.

The main advantage of this approach is that the data entry requirement is
eliminated, as is the copy-and-paste requirement.

Strategy For Flash Builder Code Authoring

When authoring code for Flash Builder projects, components should be employed. For each function a component should be created. This approach keeps the code itself maintainable. If you choose the monolithic route, the code becomes unmanageable in a very short time frame.

Mr. Arch Brooks, Software Engineer

COBOL Source Code Generation

Producing COBOL source code does not require hours of head-down programming. Automation of the entire coding process is easily achieved. COBOL has been around for quite a number of years. Its use and popularity have given way to some of the newer languages. COBOL’s utilization is further enhanced by employing subprogram techniques that rely on the Linkage Section of COBOL.

COBOL source code generation is easily accomplished by using a symbolic stream generator with keyword replacements. To accomplish it, first create an error-free COBOL subprogram containing the desired COBOL functionality. As an example, we would have a COBOL program that operates on fixed-length 80-column images; this technique is also known as Physical Sequential File Access. We then highlight various key portions of the COBOL source code so that they may be tokenized and marked for replacement by an intermediate code processor.

Typically the COBOL subprogram would accommodate input as well as output, thus allowing one set of code to be used for both input and output of information.

The “RECORD CONTAINS 80 CHARACTERS” clause is changed so that the length 80 becomes a replacement token, reading “RECORD CONTAINS … CHARACTERS” with the token standing in for the length. Replacing that token with the desired record length of 121 yields a source program that operates on logical records with a fixed length of 121 characters.

The FD name of the file description could likewise be tokenized and replaced with a desired descriptive name. This process is repeated for as many of the key elements of the COBOL subprogram as required to yield the desired code flexibility.

Simply read the skeleton source code into memory, make the necessary replacements of the identified tokens, and then save the updated source code to a unique file specification. Afterwards, compile and link the newly created COBOL source code to yield a program that meets the desired specifications. If the compiler resides on a remote computer, simply transfer the source code to the target platform with File Transfer Protocol and then proceed with the compile and link step. The majority of development is accomplished on the PC and the results are then uploaded to the mainframe for subsequent utilization.
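
A minimal Pascal sketch of the replace-and-save step. The token spellings, file names and replacement values below are hypothetical, chosen purely to illustrate the technique:

  // uses SysUtils, Classes.
  // Load the COBOL skeleton, swap each token for its value, and save the
  // result under a new file specification ready for compile and link.
  procedure GenerateCobol(const SkeletonFile, OutputFile: string);
  var
    Src: TStringList;
  begin
    Src := TStringList.Create;
    try
      Src.LoadFromFile(SkeletonFile);
      // '<REC-LEN>' and '<FD-NAME>' are hypothetical token spellings.
      Src.Text := StringReplace(Src.Text, '<REC-LEN>', '121',
        [rfReplaceAll]);
      Src.Text := StringReplace(Src.Text, '<FD-NAME>', 'CUSTOMER-FILE',
        [rfReplaceAll]);
      Src.SaveToFile(OutputFile);
    finally
      Src.Free;
    end;
  end;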

Any processor or environment with the ability to change and replace the identified tokens can be used as the code generator.

Pascal is my programming language of choice, so much of the code generation work is performed on the Windows platform and then uploaded to the IBM mainframe platform. Please keep in mind that I do have access to several PC-based COBOL compilers.

Mr. Arch Brooks, Software Engineer, Brooks Computing Systems authored this article.

COBOL Array Manipulation

COBOL has inherent features that perform array manipulations quite efficiently. Typically you use integer variables as indices, and then the PERFORM VARYING statement with the AFTER option is used to iterate through the array. Many COBOL programmers are not aware of these features, let alone how to employ them in problem solving.

While serving as an Airline Executive for Trans World Airlines (New York & Kansas City), part of my assigned duties was to manage flight crews for every flight TWA flew (domestically) for a month in advance. That task alone drove many COBOL programmers beyond the brink! But I was not one of them.

When I was assigned to the project I told my supervisor we (me, myself and I) eat this stuff up (WETSU)! Another caveat to the application was that each flight had to recognize the change in time due to daylight savings time. Almost every state adheres to daylight savings time except Arizona and a couple of other places. The project was an overall success, to the point that my management then turned over all the international flights and told me to do the same thing for them. I said no problem!

When I became the Project Officer for Crew Scheduling, approximately one hundred twenty-five COBOL programs had the time tables hard coded throughout the source code of the application. When the application ran, it consumed about nineteen hours (wall clock time) from beginning to end.

Closer scrutiny of the problem revealed that a large portion of the wall clock time involved was consumed allocating and de-allocating the same few files repeatedly.

Being a pioneer of thinking outside the box, I examined the code executed and figured out a way to allocate each file only once. Then all the code was organized as subprograms, and every subprogram that accessed information in a given file was called in the appropriate sequence. This process was repeated until all the data used for the generation of the operational schedules was accessed and utilized. The overall end result was a complete set of operational schedules in a fraction of the time normally required for successful completion.

The nineteen hours turned into only twenty minutes for successful execution. My work was duly noted and I received a cash incentive award of several thousand dollars for the redesign of the system and the performance improvement.

My next task was to eliminate the need to change the source code of the application twice a year. The twice a year changes covered the spring forward fall back scenario of daylight savings time.

A matrix or table was designed to hold each originating flight departure time followed by each subsequent flight segment departure time. Another leg of the matrix housed the arrival times for each flight segment. Those comprise two dimensions of the multidimensional array designed to accommodate each flight flown by TWA for a month in advance. In the spring-forward scenario one hour was added to the departure and arrival times respectively. Of course, in the fall-back scenario one hour was subtracted from the departure and arrival times respectively. Other matrices accommodated the four time zones for the continental United States. Still other matrices accommodated the fifty states.
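
An illustrative Pascal sketch (not the original COBOL) of the spring-forward adjustment over a simplified two-dimensional time table; the array bounds, names and minutes-past-midnight representation are assumptions:

  const
    MaxFlights  = 1000;  // assumed capacity for a month of flights
    MaxSegments = 8;     // assumed maximum segments per flight
  var
    // Departure and arrival times held as minutes past midnight.
    Departs, Arrives: array[1..MaxFlights, 1..MaxSegments] of Integer;

  // Shift every departure and arrival forward one hour for the
  // spring-forward change; the fall-back change would subtract 60 instead.
  procedure SpringForward;
  var
    F, S: Integer;
  begin
    for F := 1 to MaxFlights do
      for S := 1 to MaxSegments do
      begin
        Departs[F, S] := Departs[F, S] + 60;
        Arrives[F, S] := Arrives[F, S] + 60;
      end;
  end;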

Beginning with the outermost matrices, the iterations began and the indices were decremented until all matrices were accommodated. The PERFORM VARYING with the AFTER option was used for the most frequent number of occurrences in each matrix. By the final iteration the flight schedules for EST, CST, MST and PST were changed to daylight savings time, and vice versa. The table was the only code that changed, so the time required to recompile those one hundred or so programs was eliminated as well. Now that I think about it, that project was approximately thirty-one years ago!

Please feel free to contact me at my web site below for further assistance in this as well as other endeavors. Simply click on the free consultation link at the top of every active page on the web site.

Mr. Arch Brooks, Software Engineer

COBOL Debugging

COBOL programming is one of the most sophisticated and comprehensive programming environments in existence. There are many approaches to troubleshooting COBOL source code. In their infancy, COBOL programmers typically place DISPLAY statements in the program to track the processing path of the application. The results of the DISPLAY statements appear on a system output device, usually the standard system printed output queue. To examine the contents of a specific piece of data, more source code is created just to handle that situation.

The other option is to interpret core dumps or system dumps. This is also a daunting task because the programmer has to understand exactly how IBM, Sperry or Honeywell processed application programs once they were loaded into memory. To give a brief example, the IBM operating number systems were binary and hexadecimal (base 16), while the Sperry and Honeywell operating number systems were binary and octal (base 8). My head is hurting already. In the IBM environment you have to know how to locate various pointers. As an example, to determine the offending computer instruction you had to know the base address in register fourteen; the offending instruction sits at a displacement of twelve words forward from that address. Another example is that the beginning of working storage was an offset from BLL cell number eight. If this all sounds like Greek, it pretty much is. The Sperry architecture employed a separate strategy of registers for locating specific instructions in a COBOL program, and the Honeywell architecture employed an even more bizarre strategy for retrieving information from a core or system dump. I was never more relieved than when I learned to use the COBOL debug facility effectively.

The COBOL debug facility allowed for debug packages (a specific structured syntax) that would identify specific lines of COBOL source code as they were executed. The data used in the instructions, as well as the instructions themselves, would be dumped to the system output queues in a format that makes sense to a COBOL programmer. This approach saves paper because instead of dumping a snapshot of the entire memory map (usually several hundred pages), only the pertinent areas of memory are dumped (usually one page). Learning COBOL is labor intensive and time consuming.

Learning the many idiosyncrasies of COBOL and the finer aspects of using it is more labor intensive still and requires even more time. Please feel free to contact me at my web site below for further assistance in this as well as other endeavors. Simply click on the free consultation link at the top of every active page on the web site.

Mr. Arch Brooks, Software Engineer

COBOL

The language continues to evolve today. In the early 1990s it was decided to add object-orientation in the next full revision of COBOL.

The initial estimate was to have this revision completed by 1997 and an ISO CD (Committee Draft) was available by 1997. Some implementers (including Micro Focus, Fujitsu, Veryant, and IBM) introduced object-oriented syntax based on the 1997 or other drafts of the full revision.

The final approved ISO Standard (adopted as an ANSI standard by INCITS) was approved and made available in 2002.