SQLite and a Node API

In my previous article, we went through how easy it was to call a Twitter API, searching for a string and storing the matching tweet in a SQLite database on the IBM i using Node.js. Before we get around to analysing the records we have stored over the last month or so, we need a method […]

The post SQLite and a Node API appeared first on PowerWire.eu.

Extract a portion of a Date/Time/Timestamp in RPGLE – IBM i

%SUBDT

Extracting the Year, Month, Day, Hours, Minutes, Seconds or Microseconds of a given Date/Time/Timestamp is a very common requirement.

This can be done easily by using %SUBDT. The BIF's name looks similar to %SUBST, which extracts a portion of a string given a start position and length. %SUBDT instead takes a value (i.e., a Date, Time or Timestamp) and a unit (i.e., *YEARS, *MONTHS, *DAYS, *HOURS, *MINUTES, *SECONDS or *MSECONDS).

The unit passed must be valid for the type of the value. Below are the valid units for each type.
Date – *DAYS, *MONTHS, *YEARS
Time – *HOURS, *MINUTES, *SECONDS
Timestamp – *DAYS, *MONTHS, *YEARS, *HOURS, *MINUTES, *SECONDS, *MSECONDS

Syntax:

%SUBDT(value : unit { : digits { : decpos} })

Value and unit are the mandatory arguments. Digits and decimal positions are optional and can only be used with *SECONDS for a Timestamp.

We can pass either the full form of the unit or its short form. Below is the mapping of full form to short form.
*YEARS – *Y
*MONTHS – *M
*DAYS – *D
*HOURS – *H
*MINUTES – *MN
*SECONDS – *S
*MSECONDS – *MS

Let’s have a look at a couple of simple examples to see how this works.

E.g.:

Extract the Day, Month and Year from a Date field.
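The original post showed this example as an image; a minimal free-format RPGLE sketch along the same lines (variable names and dates are illustrative, not the author's original code) might look like this:

**free
dcl-s wIsoDate date(*iso) inz(d'2023-06-15');
dcl-s wDmyDate date(*dmy) inz(d'2023-06-15');
dcl-s wYear zoned(4:0);
dcl-s wMonth zoned(2:0);
dcl-s wDay zoned(2:0);

// *ISO holds a 4-digit year and *DMY only 2 digits,
// but %SUBDT returns the full 4-digit year for both
wYear = %subdt(wIsoDate : *YEARS); // 2023
wYear = %subdt(wDmyDate : *YEARS); // 2023

// The value does not have to be a variable:
wMonth = %subdt(%date() : *MONTHS); // month of the current date
wDay = %subdt(d'2023-06-15' : *DAYS); // 15, from a date literal

*inlr = *on;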

In the above example:

We are using two different date formats, *ISO and *DMY, which hold a different number of digits for the year (*ISO – 4 digits, *DMY – 2 digits). However, %SUBDT always returns the full 4-digit year irrespective of the format of the date passed. We don't necessarily have to pass the date in a variable: we pass date variables to extract the year, %DATE() to extract the month from the current date, and a date literal to extract the day from any specific date.


Let’s have a look at another example to extract Hours, Minutes and Seconds from a Time. 
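Again, the original code was shown as an image; a comparable free-format sketch (values are illustrative) could be:

**free
dcl-s wTime time inz(t'14.30.55');
dcl-s wHours zoned(2:0);
dcl-s wMinutes zoned(2:0);
dcl-s wSeconds zoned(2:0);

// Full form *HOURS – the short form *H would give the same result
wHours = %subdt(wTime : *HOURS); // 14

// %TIME() passes the current time
wMinutes = %subdt(%time() : *MN); // minutes of the current time

// A specific time passed directly as a time literal (prefix 't')
wSeconds = %subdt(t'14.30.55' : *S); // 55

*inlr = *on;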

In the above example:

We are using the full form *HOURS instead of the short form *H; either of the two returns the same result. A variable containing a time, %TIME(), or a time literal can be passed to %SUBDT: a time variable is passed to extract the hours, %TIME() is passed to extract the minutes from the current time, and a specific time is passed directly as a time literal (prefix 't') to extract the seconds.


Let’s have a look at the final example, extracting the microseconds from a Timestamp. We will also explore the two optional parameters in this example.

Extracting Years, Months, Days, Hours and Minutes from a Timestamp is the same as extracting them from a Date or a Time. Microseconds can only be extracted from a Timestamp, and the optional parameters can be used with *SECONDS to specify how many decimal positions of fractional seconds are to be extracted.
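As before, the original example was an image; a minimal free-format sketch (the timestamp value is illustrative, and the optional digits/decpos parameters require a reasonably recent RPG compiler) might be:

**free
dcl-s wTimestamp timestamp inz(z'2023-06-15-14.30.55.123456');
dcl-s wMSeconds zoned(6:0);
dcl-s wSeconds packed(8:6);

// Extract the microseconds portion of the timestamp
wMSeconds = %subdt(wTimestamp : *MSECONDS); // 123456

// Optional parameters: digits = 6, decimal positions = 4,
// so %SUBDT returns the seconds with 4 fractional digits (55.1234).
// wSeconds has 6 decimal positions, so it ends up as 55.123400.
wSeconds = %subdt(wTimestamp : *SECONDS : 6 : 4);

*inlr = *on;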


In the above example:

First we extract the microseconds from the timestamp. Then we extract the seconds from the timestamp along with fractional seconds by specifying the optional parameters: digits (total) 6 and decimal positions 4, so %SUBDT returns only 4 fractional digits along with the seconds. The wSeconds variable has been defined with 6 decimal positions, so the last two decimal positions are stored as '00'. If the third parameter (digits) is specified, the fourth parameter (decimal positions) must also be specified, and it should always be 2 less than the third parameter (leaving 2 digits for the seconds). If the optional parameters aren't specified, no fractional seconds are returned.



If you have any suggestions or feedback, please leave a comment below or use the contact form.

How to Power AI-Driven Analytics with IBM Z and IBM i

Real-time analytics are playing an increasingly important role in today’s midsize and large enterprises. Whether your focus is operational data analytics, service intelligence, or security information, it’s important to have systems in place to monitor, analyze, and report information. IT data analytics is the foundation stone upon which IT excellence is built.

Splunk is a leading platform for IT analytics. Unfortunately, if your company is running IBM i or IBM Z systems, the observability of those systems in Splunk is limited out of the box. Although IBM systems serve as the backbone for many core business applications, the operational information about their performance, security, and technical operations remains inaccessible to Splunk.

IBM Z, for example, includes the System Management Facility (SMF), but that information remains inaccessible to Splunk in its native format. When IT teams cannot see how IBM Z platforms are performing, service disruptions take longer to resolve, or the time spent discovering the issue can even lead to expensive downtime.

The good news is that there are viable options for bridging the gap between your IBM systems and IT data analytics (whether you are using Splunk or another tool). Let’s look at some scenarios where this can add value to your IT operations strategy.

Read our eBook

Splunk and the Mainframe: 6 Real-World Case Studies for ITOA, ITSI and SIEM

To learn more about Ironstream and explore real-world case studies for ITOA, ITSI, and SIEM where new technologies can provide answers to the questions challenging organizations, read our eBook.

IT service intelligence (ITSI)

ITSI is about mapping and monitoring an organization’s IT environment to identify potential problems, prioritize issues, and resolve incidents promptly. In addition, ITSI is used to track Key Performance Indicators (KPIs) in order to ensure that service levels are meeting the expected standards.

A luxury auto dealer network serving the Americas was running CICS and Db2 apps on its IBM Z, which interacted with dealer IT systems across the continent. The system was experiencing significant performance problems, but company IT personnel were entirely unaware of those problems until a dealership complained directly to the company’s CIO. The organization’s existing ITSI systems had no direct visibility to the mainframe’s performance logs. Because its IBM system was, in effect, walled off from its distributed systems, the company’s existing IT analytics system had no awareness of the problem whatsoever.

The auto dealer network rolled out Precisely Ironstream, along with the Ironstream Module for Splunk IT Service Intelligence. That provided immediate visibility to the relevant information from the IBM mainframe, giving the organization’s IT personnel a complete picture of its entire IT landscape for the first time ever. As a result, the company was able to reduce the mean time to resolution (MTTR), speed the flow of information, and increase the satisfaction of the dealers whom it served.

IT operational analytics (ITOA)

While ITSI is focused on identifying, prioritizing, and resolving issues, ITOA is about informing operational decisions about an organization’s IT landscape. Once again, significant challenges may arise when technology leaders within the company have an incomplete picture of that landscape.

A global financial services firm, for example, had identified some high-priority issues on one of its IBM Z systems. The firm was seeking to better understand its options for maintaining operational efficiency and controlling costs.

One particular job, for instance, was consuming 30% more CPU time than was normal. That issue was typical of the problems the firm was seeing elsewhere in its IT landscape, and it was costing the company time and money. The organization’s mean time to resolution was significantly longer than it should have been, and its existing systems could not alert the team promptly to issues in various business units across the company. Perhaps most importantly, there was no connection between the organization’s IBM Z and its distributed computing systems.

To make better operational decisions, the company knew it needed to have a complete picture of its IT landscape. In the past, the firm was handling that process the hard way, by offloading SMF data from the mainframe on a daily basis, extracting the required records and fields from that, then using SAS to generate reports about mainframe performance.

The team soon realized, however, that this approach was exceedingly inefficient, so they began looking for a better way of doing the job. The firm selected Precisely Ironstream to help the team process and forward SMF data to the Splunk Enterprise analytics platform. This gave the company a real-time view of system performance, without costly and time-consuming manual intervention.

Security information and event management (SIEM)

A large healthcare organization was struggling with SOC2 certification because the SMF logs from its three IBM mainframe systems were inaccessible to Splunk Enterprise. The organization was using IBM’s zSecure products alongside some in-house utilities to manage security-related tasks and data. This organization’s situation was especially challenging because of the sheer volume of SMF data, which had the potential to grow as high as 800 GB per day.

The healthcare organization created an automated process with Ironstream, forwarding selected SMF records to Splunk to track the information needed for SOC2 certification compliance. This provided real-time visibility and enabled the organization to eliminate the manual processes and related costs associated with using zSecure.

In another case, a federal law enforcement agency was grappling with security compliance audits, which required it to collect and analyze log data from a variety of systems across its IT landscape. The agency had selected Splunk Enterprise for log management, but one important piece of the puzzle was missing. The organization’s IBM mainframe housed some extremely sensitive information, including authentication data, enterprise-wide details on password changes, log-in successes and failures, and account lock-outs, but that information was inaccessible to Splunk.

By implementing Precisely Ironstream, this federal agency gained access to a complete picture of security-related events in real time, giving it all the information needed to comply with security audits.

Ironstream provides the critical missing link between IBM mainframes and Splunk Enterprise for IT analytics.  Precisely has been a Splunk Technology Alliance partner since 2014, and with our Ironstream product, we provide the industry’s premier mainframe log data access solution for Splunk customers.

To learn more about Ironstream and explore real-world case studies for ITOA, ITSI, and SIEM where new technologies can provide answers to the questions challenging organizations, read our eBook Splunk and the Mainframe: 6 Real-World Case Studies for ITOA, ITSI and SIEM.

The post How to Power AI-Driven Analytics with IBM Z and IBM i appeared first on Precisely.

Valence Developer Diaries #018 – The New “local data” feature for grids and edit grids!

In this session, Sean and Johnny demonstrated two new features in Valence 6: “local data”, just introduced for grids and edit grids, and the update to the NAB RPG button program. Please write any suggestions you have for future content in the comments below, and please like and subscribe!

1:38 – Going over the widgets created prior to the session: edit grid and form
2:00 – Edit Grid – overview of the new option under the Data section called “Local”
3:45 – Creating a new NAB app using the form and edit grid widgets
4:55 – Adding a button to the app section in NAB Behaviors
5:15 – Setting up a button action as “Call an RPG Program”
6:10 – Creating orders for existing customers in the NAB app – locally
8:53 – Reviewing the NAB app frontend and backend
9:40 – Comparison of the ways data was handled previously and currently in NAB with the usage of “Local”
11:00 – Reviewing the RPG, new settings in Valence NAB
14:00 – SetResponse, SetWidget