
Archive and Purge


What is the Data Archive and Purge Process?

Data Archive: Data/case archiving is the process of moving inactive data from an active storage location into an archive storage location.

Inactive data – Data (cases) that will not be used actively for a certain period of time (business policies define the criteria).

Active storage – The storage where data is created/updated and actively used by the business operations team for business functioning; normally databases. The active storage system should be fast enough to support business operations, scalable enough to meet storage needs, and able to sustain high throughput (I/O).

Inactive storage/Archive storage – Normally on-premises tape and disks are used for archiving. Cloud storage services now offer large amounts of space at lower cost.

Purging is the process of deleting inactive data that is no longer required in active or archive storage, for regulatory or performance reasons.

Archived data should be purged from active locations to free space and improve performance.
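
The retention decision above can be sketched as a simple age check. This is an illustrative sketch only: the field names and the 180-day cutoff are assumptions for the example, not Pega specifics.

```python
from datetime import datetime, timedelta

# Assumed retention policy: resolved cases older than 180 days move to archive.
RETENTION_DAYS = 180

def partition_cases(cases, now=None):
    """Split cases into (active, archivable) based on their resolution date."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    active, archivable = [], []
    for case in cases:
        resolved = case.get("resolved_on")
        # Open cases (no resolution date) always stay in active storage.
        if resolved is None or resolved > cutoff:
            active.append(case)
        else:
            archivable.append(case)
    return active, archivable
```

Cases returned in the archivable list would be copied to archive storage and then purged from the active tables.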

PEGA Archive and Purge Feature


coming soon

Case Management


What is Case Management ?


A case, which represents a business process, is made up of many stages, processes, tasks, policies, and supporting content. The case as a whole continues to change throughout its life cycle due to internal and external events. Depending on the context of the case, individual tasks, processes, or stages can be resolved by different customer service representatives (CSRs). This flexibility helps you achieve your goals in the most effective way.



Objective/Purpose of case management:

Accountability
Productivity
Efficiency
Consistency
Tracking
Visibility
Overall improved quality of business management



Request Channel (omni) - A channel is a messaging service or voice service through which requests arrive


  • Email (offline, unformatted text conversation)
  • Phone (manual, online oral conversation)
  • IVR (automated oral conversation)
  • Chat (online, unformatted text conversation)
  • Web (self-service) - direct case creation access for the end user
  • Mashup -
  • Mobile - direct case creation access for the end user
  • Integration from other applications (structured text conversation - both offline and online):
                                   XML, JSON, SWIFT, CAMT, Telex, and other ICDs


Case Creation (design case) - What is required to create/initiate a case?

  • Who is the requestor?
    • Contacts -Parties
  • What is your membership?
    • Insurance account?
    • Bank Account?
    • What is the transaction?
  • Purpose of the conversation - call/inquiry?
    • Which product?
    • What is the issue?
  • Do we need to keep the case open or resolve the case?
  • Do we need to send any acknowledgment of case creation?


During case creation:

  • Validate all mandatory fields
  • Is there any previous inquiry from the same requestor on the same topic? Duplicate check?
  • Pull the required additional information automatically
    • Contact
    • Account
    • Insurance
    • Transaction
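
The duplicate check above can be sketched as follows. The field names `requestor` and `topic` are illustrative assumptions, not actual Pega properties.

```python
# Flag a new inquiry as a duplicate when the same requestor already has
# an open case on the same topic.
def is_duplicate(new_case, open_cases):
    return any(
        c["requestor"] == new_case["requestor"] and c["topic"] == new_case["topic"]
        for c in open_cases
    )

open_cases = [{"requestor": "C-100", "topic": "billing"}]
print(is_duplicate({"requestor": "C-100", "topic": "billing"}, open_cases))  # → True
```

A real implementation would also consider time windows and case status, but the principle is the same: check before you create.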


Once the case is created:

  • What is the case number/tracking number?
  • What is the urgency?
  • What tasks (assignments) do we need to create?
  • To whom should we assign the task (assignment)? Routing
  • What is the time limit to complete the task? SLA
                                   Keep a time limit for each task
  • Do we need any approval to complete the task? Maker and checker
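
The SLA step above, keeping a time limit for each task, can be sketched as goal and deadline timestamps computed from the creation time. The hour values here are arbitrary examples, not Pega defaults.

```python
from datetime import datetime, timedelta

# Each task gets a goal (soft target) and a deadline (hard limit)
# relative to its creation time.
def sla_times(created_at, goal_hours, deadline_hours):
    goal = created_at + timedelta(hours=goal_hours)
    deadline = created_at + timedelta(hours=deadline_hours)
    return goal, deadline

goal, deadline = sla_times(datetime(2020, 1, 1, 9, 0), goal_hours=4, deadline_hours=8)
print(goal, deadline)  # → 2020-01-01 13:00:00 2020-01-01 17:00:00
```

Escalation logic (alerts, reassignment) would then compare the current time against these two timestamps.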

Integrations:

  •  How can we complete the task?
  • What are the dependencies?
  • Do you need additional information?


Evidence?

  • How do we manage additional information received or gathered in the process? Attachments?
  • How do we link additional information with the case? Linking


Event Tracking?

  • How do we track all actions/events performed in the process?
  • What is an automatic process?
  • What is a manual process, and who did what? History/Audit

Oracle Global Temporary Table

A temporary table is a table that holds data only for the duration of a session or transaction.


CREATE GLOBAL TEMPORARY TABLE xyz (column1 NUMBER, column2 VARCHAR2(100)) ON COMMIT DELETE ROWS;  -- column datatypes are required; these are example types


Because the table is defined with ON COMMIT DELETE ROWS, the rows are deleted automatically when the transaction commits - it is the COMMIT below, not the SELECT, that clears the table:

select column1, column2 from xyz;
commit;

Load Testing Terms

What is the 90th percentile?

Suppose the response times are 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10 seconds.

9 is the 90th percentile: 90% of the responses completed in 9 seconds or less.
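
Under the nearest-rank convention, this can be computed as:

```python
# Nearest-rank percentile: the smallest value such that at least pct% of
# the samples are less than or equal to it.
def percentile(values, pct):
    ordered = sorted(values)
    # Integer ceiling of pct/100 * n, computed without float rounding error.
    rank = (pct * len(ordered) + 99) // 100
    return ordered[rank - 1]

times = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(percentile(times, 90))  # → 9
```

Note that other percentile conventions (e.g. linear interpolation, as NumPy uses by default) can give slightly different values on small samples.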

What is Think time?

The time between the completion of one request and the start of the next request. 


What is Pace (Pacing) Time?

How fast the script submits server requests - the interval between the start of one iteration and the start of the next.

Throughput = Total No of Transactions / Time in Seconds; Pacing >= (Response_Time + Think_Time).
From your requirements: 100 iterations in total, and 1 iteration has 6 transactions, so total no of transactions = 600.
Throughput for 1 minute: 600/60 = 10; throughput for 1 second: ~0.16.
Assuming 50 concurrent virtual users (Little's Law: Users = Throughput x Pacing), 50 = 0.16 x Pacing, so Pacing = 312.5 seconds (300 seconds with the unrounded throughput 600/3600).
To achieve 100 iterations in 1 hour you have to set pacing to roughly this value; make sure Pacing >= Response_Time + Think_Time.
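
The arithmetic above can be sketched as follows. The 50 concurrent virtual users is an assumption read into the example (Little's Law: Users = Throughput x Pacing); note that the exact answer is 300 seconds, and 312.5 only appears when the throughput is first rounded down to 0.16.

```python
# Pacing = users / throughput, where throughput = transactions / duration.
# Rearranged to users * duration / transactions so the division is exact.
def pacing_seconds(users, total_transactions, duration_seconds):
    return users * duration_seconds / total_transactions

# 100 iterations x 6 transactions over one hour, 50 virtual users:
print(pacing_seconds(50, 600, 3600))  # → 300.0
```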

What is Correlation  ?
Correlation is the capturing of dynamic values passed from the server to the client and back. We save this captured value into a LoadRunner parameter, and then use this parameter in the script in place of the original value.
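
What correlation automates can be illustrated outside LoadRunner with a plain regex capture. The response body and the `csrf_token` field here are hypothetical examples; in LoadRunner itself this is what functions like web_reg_save_param do for you.

```python
import re

# A dynamic value returned by the server (hypothetical sample response).
response = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

# Capture the dynamic value into a "parameter"...
match = re.search(r'name="csrf_token" value="([^"]+)"', response)
token = match.group(1)

# ...and replay it in the next request instead of a hard-coded value.
next_request = f"action=submit&csrf_token={token}"
print(next_request)  # → action=submit&csrf_token=a1b2c3d4
```

Without this step, a recorded script replays the stale value captured at record time and the server rejects it.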




Pega and Cloud

Cloud Types/Services :

SaaS - Software as a Service 
PaaS - Platform as a Service
IaaS - Infrastructure as a Service 





Benefits/advantages  of cloud computing.

  • Usage-based costs
  • Scalable, on demand
  • Quick to market



Deployment Models.

  • Private
  • Public cloud providers
  • Hybrid

Pega Cloud options.

  • Public cloud option
    • Pega Managed cloud: ( PaaS and SaaS options)
    • Customer Managed cloud (IaaS -Amazon AWS, Microsoft Azure, Google Cloud platform) 
    • Partner/vendor  Managed cloud (IaaS -Amazon AWS, Microsoft Azure, Google Cloud platform) 
    • IaaS + PCF (Pivotal Cloud Foundry) = (PaaS)


  • Private cloud
    • Customer Managed clouds -
    • CaaS (Container as a Service)
      • Docker option

Non-Cloud options.

  • Physical servers for all layers- On-Premise


More to explore or limitations.


Pega Cloud supports only Tomcat and PostgreSQL.







Pega Frameworks

What are Pega Frameworks?



  • A generic solution to an industry problem or requirement. 

  • As it is generic, it may not fit 100% of your organization's needs, but instead of building an application from scratch you can extend the generic layer and customize it to your organization's needs.

  • If the customization need is more than 50%, it is better to implement from scratch.

  • If PEGA implements the customized features in a different way in a later version, upgrades will be very difficult.

PEGA Features


What is the PEGA Platform?

  • The PEGA Platform is a Java (J2EE)/web-based BPM tool with an integrated development environment and an integrated administrator portal.

  • A Pega application can be deployed as a WAR or EAR file, and it is very lightweight (rebuilding the WAR file is never required except in some exceptional configuration-change scenarios).

  • The atomic development unit in PEGA is called a rule (similar to a Java class instance). 

  • A rule is an instance of a rule type/rule class; rule forms are used to create rules.

  • A ruleset is the deployment unit for a set/group of rules. 

  • The rules schema contains the list of all rules and dependencies shipped as part of the Pega platform, plus the rules developed as part of your own development on the platform.


PEGA PLATFORM features 

  • Rapid Applications Development
    • low-code development features (Dev Studio and App Studio)
    • form-based and declarative development
    • layered architecture (reuse)
  • Case Management
    • Case/Ticket registration via Omni channel
    • Case sequence
    • Case classification (case types)
    • Case routing/prioritization(IVA)
    • SLA
    • Maker checker 
  • Audit trail 
    • capture user/automated  events in case life cycle
    • summary/detail view of audit
  • Documents support
    • document upload and download for evidence
    • summary of document 
  • Auto Event Management
    • Automatic SLA trigger
    • Alerts
    • escalations
  • Entitlement
    • Role based entitlement
    • Attribute based
  • Integrations-
    • rich set of built in integrations
    • Supports Integration to  latest tools
    • configuration based Integration 
  • Reporting
    • a number of reporting options
    • manager (user) configurable reports
  • Dashboard
    • Variety of Built in widgets
    • user configurable widgets
    • Senior manager views
  • Delegation
    • Business policies delegation options
  • NLP
    • Built in NLP features for unstructured communication

Table Partitions in PEGA


PEGA and Table Partition


  • Table partitioning is splitting larger tables into smaller pieces, based on certain criteria, for performance and maintenance purposes.

  • Another benefit of partitioning is reclaiming space by dropping a whole partition. 

  • Please refer to the article below for more information.


How to choose Partition Criteria?







  

NLP Natural Language Processing

Natural Language Processing is scanning through unstructured data and extracting entities to identify the intent of the data.
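
As a toy illustration of "extract entities, infer intent", consider a keyword-based classifier. This is a deliberately naive sketch: real NLP engines (including Pega's) use trained models, not keyword lists, and the intents and keywords below are invented for the example.

```python
# Assumed example intents and trigger words - illustration only.
INTENT_KEYWORDS = {
    "complaint": {"unhappy", "wrong", "dispute"},
    "inquiry": {"status", "when", "balance"},
}

def detect_intent(text):
    """Return the first intent whose keywords intersect the message words."""
    words = set(text.lower().split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "unknown"

print(detect_intent("What is my account balance"))  # → inquiry
```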

SQL to read from BLOB

In some production scenarios you might need to make a decision based on property values that are not exposed as columns.

PEGA provides some functions and Oracle Java classes for this. Below is the syntax of the SQL to read from the BLOB (PropName being the property you want to extract):

SELECT pr_read_from_stream('PropName', pzInsKey, pzPVStream) AS "PropName" FROM pc_work;


Please make sure your Java classes are valid before you run pr_read_from_stream.

Process Mining - BPM

  • Process mining is the determination of the actual business operational process from event (audit) logs using software tools.
  • The advantage of process mining is identifying deviations from the standard business process and fixing those outliers.
  • There can be challenges in determining the business operational process from event (audit) logs due to lack of data or improper data.
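
A minimal sketch of the discovery step: count activity-to-activity transitions across cases in an event (audit) log. The log structure here is an assumption for illustration; real process-mining tools build full process models, but frequent unexpected transitions already reveal deviations.

```python
from collections import Counter

def transition_counts(event_log):
    """event_log maps case id -> list of activities in time order."""
    counts = Counter()
    for activities in event_log.values():
        # Each consecutive pair of activities is one observed transition.
        counts.update(zip(activities, activities[1:]))
    return counts

log = {
    "C1": ["Create", "Review", "Resolve"],
    "C2": ["Create", "Resolve"],  # skipped Review - a possible deviation
}
print(transition_counts(log))
```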


Heap Dump Analysis - Out Of Memory errors -PEGA

OOM - out of memory - is one of the common errors that teams run into while figuring out optimized JVM and cache settings. 

Here are some tools that help analyze heap dumps.

IBM HeapAnalyzer
Eclipse MAT - Memory Analyzer Tool
IBM® Support Assistant
(to download IBM Support Assistant, you must be a registered user on the IBM Software Support website)


Heap dump formats:

PHD -Portable heap dumps
HPROF binary dumps
System dumps - core dumps



Eclipse MAT- memory analyzer tool 

You can download it from https://www.eclipse.org/mat/downloads.php


You need the DTFJ plugin to support PHD heap dumps:

http://public.dhe.ibm.com/ibmdl/export/pub/software/websphere/runtimes/tools/dtfj/







The Leak Suspects report points to where the problem is. In the sample below, when you drill down you can see which class objects are causing the OOM issue.








How to externalize file attachments in PEGA -S3

How to externalize the attachments in PEGA? 


What is externalization?

By default, PEGA stores file attachments in database tables.

How to store file attachments in external storage system (S3)?


PEGA provides forms to configure Amazon S3 connectivity and externalize attachments in the following three simple steps:

STEP 1: Create Authentication profile

Create-> Security->Authentication Profile














STEP 2: Create a Repository and check the test connectivity
Create->Sysadmin->Repository


STEP 3: Configure the storage location to Amazon S3
Open the application instance -> Integration & security tab -> Content storage upload section -> select the S3 repository


























This step completes the configuration.

Validate/Test the file location
Pick a case -> Recent-content (attachment)->upload a file























Validate the file in the Amazon S3



Validate the file in the Database /Internal process?
Data-Admin-WorkAttach-File
coming soon...

Potential issues in the configuration

you are not authorized to create, modify or lock DATA-REPOSITORY











Please follow the steps below to overcome this issue. 





Additional points to consider :

  • Make sure the connections are secure and follow your organization's security policies
  • Make sure connection-exception scenarios are handled
  • What happens if S3 goes down? (it can, especially in an internal S3 scenario)

Limitation/More to explore

  • Currently it doesn't support configuring on-premises S3 storage (you need to customize)
  • How to extend to other attachment types
  • How to support multiple buckets in a single application

Do not tamper with user input elements

If you are not able to track what values users entered, application users might claim that they selected the correct value but the application behaved incorrectly. 

The application can use the values of user inputs in business logic but shouldn't tamper with those values. 

The application can allow users to edit their inputs at a later point in time, but make sure you capture those changes for audit purposes.









ARCHIVE

Archiving is the process of moving inactive data to a cheaper storage system, where it is kept for a longer period for audit purposes.

CICD AND DEVOPS


PEGA CUSTOMER SERVICE


PEGA SMART INVESTIGATE


PEGA PLATFORM


Reclaim unused Space




  • As part of the archive process, we delete rows from the tables in primary storage once the data has been moved to archive storage.

  • The delete process leaves fragments in the tables, and the tables keep growing.

  • Oracle provides several options to reclaim the space / defragment:
    • Redefinition
    • Shrink
     However, these features require an ample database green zone (maintenance window), which is not viable for high-availability platforms.

What are the alternatives? 

How to optimize or improve your customer support response time

How to Optimize or Improve Your Customer Response Times?

Response time is one of the key client-experience attributes.

How do you track your customer tickets?

Do you use customer service software tools (automated), or
do you use a manual process (paper-based, Outlook, Gmail, etc.)?

How do your customers interact with support teams?

Phone, chat, IVR, or email?

Hopefully 90-99% of phone/chat/IVR inquiries are resolved on the call itself.

Email inquiries are more challenging.

Can you produce a report of current response times, and what is the quality of that data?

If you are using automated tools and you have current response times:
  • Document the end-to-end flow
    • Can you document the end-to-end process flow?
    • Are you unable to document the E2E flow due to multiple paths?
    • can you trace /  
    • Standardize your process - the document
  • Identify whether there is a requirement to change the business process
  • Identify any steps that can be eliminated or automated with technology
  • Is the client support team using the technology?





Security - Audit for operator ID changes and pytracksecuritychanges

Operator audit is critical to preventing fraud in the organization.

  • Operator ID instances are data instances in Pega.
  • Pega provides a field-level tracking feature to support such audits.
  • Create the pyTrackSecurityChanges DT (Data Transform) in the class for which you want to track the fields, and configure the required fields.
  • Create the TrackSecurityChanges trigger (save-as from the OOTB rule) in the class for which you want to track the fields.




How to debug AWS S3 connection handshake

S3 Connection sample  Java App2jva




log4j  settings
Debug log


log4j:ERROR Could not find value for key log4j.appender.file
log4j:ERROR Could not instantiate appender named "file".
2020-04-15 22:55:11,158 [main] DEBUG com.amazonaws.AmazonWebServiceClient -  Internal logging successfully configured to commons logger: true
2020-04-15 22:55:11,263 [main] DEBUG com.amazonaws.metrics.AwsSdkMetrics -  Admin mbean registered under com.amazonaws.management:type=AwsSdkMetrics
* srinivas Gowni
2020-04-15 22:55:12,053 [main] DEBUG com.amazonaws.request -  Sending Request: GET https://s3.us-west-2.amazonaws.com / Headers: (User-Agent: aws-sdk-java/1.11.267 Windows_10/10.0 Java_HotSpot(TM)_64-Bit_Server_VM/13.0.2+8 java/13.0.2, amz-sdk-invocation-id: b0ab5b4e-d3c5-0886-5761-85646d8b8433, Content-Type: application/octet-stream, ) 
2020-04-15 22:55:12,112 [main] DEBUG com.amazonaws.auth.AWS4Signer -  AWS4 Canonical Request: '"GET
/

amz-sdk-invocation-id:b0ab5b4e-d3c5-0886-5761-85646d8b8433
amz-sdk-retry:0/0/500
content-type:application/octet-stream
host:s3.us-west-2.amazonaws.com
user-agent:aws-sdk-java/1.11.267 Windows_10/10.0 Java_HotSpot(TM)_64-Bit_Server_VM/13.0.2+8 java/13.0.2
x-amz-content-sha256:UNSIGNED-PAYLOAD
x-amz-date:20200416T025512Z

amz-sdk-invocation-id;amz-sdk-retry;content-type;host;user-agent;x-amz-content-sha256;x-amz-date
UNSIGNED-PAYLOAD"
2020-04-15 22:55:12,112 [main] DEBUG com.amazonaws.auth.AWS4Signer -  AWS4 String to Sign: '"AWS4-HMAC-SHA256
20200416T025512Z
20200416/us-west-2/s3/aws4_request
83befc2f1a4732c9f1f61afc66a9c0b5bd2c077d34a196a4e10a1c35b6fad589"
2020-04-15 22:55:12,112 [main] DEBUG com.amazonaws.auth.AWS4Signer -  Generating a new signing key as the signing key not available in the cache for the date 1586995200000
2020-04-15 22:55:12,139 [main] DEBUG org.apache.http.client.protocol.RequestAddCookies -  CookieSpec selected: default
2020-04-15 22:55:12,161 [main] DEBUG org.apache.http.client.protocol.RequestAuthCache -  Auth cache not set in the context
2020-04-15 22:55:12,162 [main] DEBUG org.apache.http.impl.conn.PoolingHttpClientConnectionManager -  Connection request: [route: {s}->https://s3.us-west-2.amazonaws.com:443][total kept alive: 0; route allocated: 0 of 50; total allocated: 0 of 50]
2020-04-15 22:55:12,202 [main] DEBUG org.apache.http.impl.conn.PoolingHttpClientConnectionManager -  Connection leased: [id: 0][route: {s}->https://s3.us-west-2.amazonaws.com:443][total kept alive: 0; route allocated: 1 of 50; total allocated: 1 of 50]
2020-04-15 22:55:12,205 [main] DEBUG org.apache.http.impl.execchain.MainClientExec -  Opening connection {s}->https://s3.us-west-2.amazonaws.com:443
2020-04-15 22:55:12,372 [main] DEBUG org.apache.http.impl.conn.DefaultHttpClientConnectionOperator -  Connecting to s3.us-west-2.amazonaws.com/52.218.246.72:443
2020-04-15 22:55:12,372 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  connecting to s3.us-west-2.amazonaws.com/52.218.246.72:443
2020-04-15 22:55:12,372 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  Connecting socket to s3.us-west-2.amazonaws.com/52.218.246.72:443 with timeout 10000
2020-04-15 22:55:12,534 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  Enabled protocols: [TLSv1.3, TLSv1.2, TLSv1.1, TLSv1]
2020-04-15 22:55:12,535 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  Enabled cipher suites:[TLS_AES_256_GCM_SHA384, TLS_AES_128_GCM_SHA256, TLS_CHACHA20_POLY1305_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_DHE_RSA_WITH_AES_256_GCM_SHA384, TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_DHE_DSS_WITH_AES_256_GCM_SHA384, TLS_DHE_RSA_WITH_AES_128_GCM_SHA256, TLS_DHE_DSS_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256, TLS_DHE_RSA_WITH_AES_256_CBC_SHA256, TLS_DHE_DSS_WITH_AES_256_CBC_SHA256, TLS_DHE_RSA_WITH_AES_128_CBC_SHA256, TLS_DHE_DSS_WITH_AES_128_CBC_SHA256, TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDH_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA256, TLS_ECDH_RSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA, TLS_ECDH_RSA_WITH_AES_256_CBC_SHA, TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA, TLS_ECDH_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_EMPTY_RENEGOTIATION_INFO_SCSV]
2020-04-15 22:55:12,535 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  socket.getSupportedProtocols(): [TLSv1.3, TLSv1.2, TLSv1.1, TLSv1, SSLv3, SSLv2Hello], socket.getEnabledProtocols(): [TLSv1.3, TLSv1.2, TLSv1.1, TLSv1]
2020-04-15 22:55:12,536 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  TLS protocol enabled for SSL handshake: [TLSv1.2, TLSv1.1, TLSv1, TLSv1.3]
2020-04-15 22:55:12,536 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  Starting handshake
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -  Secure session established
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -   negotiated protocol: TLSv1.2
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -   negotiated cipher suite: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -   peer principal: CN=*.s3-us-west-2.amazonaws.com, O="Amazon.com, Inc.", L=Seattle, ST=Washington, C=US
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -   peer alternative names: [s3-us-west-2.amazonaws.com, *.s3-us-west-2.amazonaws.com, s3.us-west-2.amazonaws.com, *.s3.us-west-2.amazonaws.com, s3.dualstack.us-west-2.amazonaws.com, *.s3.dualstack.us-west-2.amazonaws.com, *.s3.amazonaws.com, *.s3-control.us-west-2.amazonaws.com, s3-control.us-west-2.amazonaws.com, *.s3-control.dualstack.us-west-2.amazonaws.com, s3-control.dualstack.us-west-2.amazonaws.com, *.s3-accesspoint.us-west-2.amazonaws.com, *.s3-accesspoint.dualstack.us-west-2.amazonaws.com]
2020-04-15 22:55:12,887 [main] DEBUG com.amazonaws.http.conn.ssl.SdkTLSSocketFactory -   issuer principal: CN=DigiCert Baltimore CA-2 G2, OU=www.digicert.com, O=DigiCert Inc, C=US
2020-04-15 22:55:12,893 [main] DEBUG com.amazonaws.internal.SdkSSLSocket -  created: s3.us-west-2.amazonaws.com/52.218.246.72:443
2020-04-15 22:55:12,894 [main] DEBUG org.apache.http.impl.conn.DefaultHttpClientConnectionOperator -  Connection established 10.0.0.15:52944<->52.218.246.72:443
2020-04-15 22:55:12,894 [main] DEBUG org.apache.http.impl.conn.DefaultManagedHttpClientConnection -  http-outgoing-0: set socket timeout to 50000
2020-04-15 22:55:12,894 [main] DEBUG org.apache.http.impl.execchain.MainClientExec -  Executing request GET / HTTP/1.1
2020-04-15 22:55:12,894 [main] DEBUG org.apache.http.impl.execchain.MainClientExec -  Proxy auth state: UNCHALLENGED
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> GET / HTTP/1.1
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> Host: s3.us-west-2.amazonaws.com
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> x-amz-content-sha256: UNSIGNED-PAYLOAD
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> Authorization: AWS4-HMAC-SHA256 Credential=AKIA3W6II5PIX4A3QVKE/20200416/us-west-2/s3/aws4_request, SignedHeaders=amz-sdk-invocation-id;amz-sdk-retry;content-type;host;user-agent;x-amz-content-sha256;x-amz-date, Signature=c0aacfb323499fbc957d0ab30b123ee054d4fdbdfd3e59e1848f31b41fdd9b59
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> X-Amz-Date: 20200416T025512Z
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> User-Agent: aws-sdk-java/1.11.267 Windows_10/10.0 Java_HotSpot(TM)_64-Bit_Server_VM/13.0.2+8 java/13.0.2
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> amz-sdk-invocation-id: b0ab5b4e-d3c5-0886-5761-85646d8b8433
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> amz-sdk-retry: 0/0/500
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> Content-Type: application/octet-stream
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> Content-Length: 0
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.headers -  http-outgoing-0 >> Connection: Keep-Alive
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "GET / HTTP/1.1[\r][\n]"
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "Host: s3.us-west-2.amazonaws.com[\r][\n]"
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "x-amz-content-sha256: UNSIGNED-PAYLOAD[\r][\n]"
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "Authorization: AWS4-HMAC-SHA256 Credential=AKIA3W6II5PIX4A3QVKE/20200416/us-west-2/s3/aws4_request, SignedHeaders=amz-sdk-invocation-id;amz-sdk-retry;content-type;host;user-agent;x-amz-content-sha256;x-amz-date, Signature=c0aacfb323499fbc957d0ab30b123ee054d4fdbdfd3e59e1848f31b41fdd9b59[\r][\n]"
2020-04-15 22:55:12,897 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "X-Amz-Date: 20200416T025512Z[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "User-Agent: aws-sdk-java/1.11.267 Windows_10/10.0 Java_HotSpot(TM)_64-Bit_Server_VM/13.0.2+8 java/13.0.2[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "amz-sdk-invocation-id: b0ab5b4e-d3c5-0886-5761-85646d8b8433[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "amz-sdk-retry: 0/0/500[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "Content-Type: application/octet-stream[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "Content-Length: 0[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "Connection: Keep-Alive[\r][\n]"
2020-04-15 22:55:12,898 [main] DEBUG org.apache.http.wire -  http-outgoing-0 >> "[\r][\n]"
2020-04-15 22:55:13,112 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "HTTP/1.1 200 OK[\r][\n]"
2020-04-15 22:55:13,112 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "x-amz-id-2: iEOvH312HGna74kM+w1k4fy/zhemEphyPhzIwE0EDreNyxVXLyvR9yimqjhsxGm/9P/N1f7badc=[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "x-amz-request-id: C9BDF89507649C5C[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "Date: Thu, 16 Apr 2020 02:55:14 GMT[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "Content-Type: application/xml[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "Transfer-Encoding: chunked[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "Server: AmazonS3[\r][\n]"
2020-04-15 22:55:13,113 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "[\r][\n]"
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << HTTP/1.1 200 OK
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << x-amz-id-2: iEOvH312HGna74kM+w1k4fy/zhemEphyPhzIwE0EDreNyxVXLyvR9yimqjhsxGm/9P/N1f7badc=
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << x-amz-request-id: C9BDF89507649C5C
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << Date: Thu, 16 Apr 2020 02:55:14 GMT
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << Content-Type: application/xml
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << Transfer-Encoding: chunked
2020-04-15 22:55:13,117 [main] DEBUG org.apache.http.headers -  http-outgoing-0 << Server: AmazonS3
2020-04-15 22:55:13,125 [main] DEBUG org.apache.http.impl.execchain.MainClientExec -  Connection can be kept alive for 60000 MILLISECONDS
2020-04-15 22:55:13,188 [main] DEBUG com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser -  Sanitizing XML document destined for handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListAllMyBucketsHandler
2020-04-15 22:55:13,189 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "17d[\r][\n]"
2020-04-15 22:55:13,189 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "<?xml version="1.0" encoding="UTF-8"?>[\n]"
2020-04-15 22:55:13,190 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "<ListAllMyBucketsResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Owner><ID>5e4e0e69f6bd199d4ba035a34bdb7d7a5a4f62092083e55d5f31878a7ef57b1e</ID><DisplayName>srinug13</DisplayName></Owner><Buckets><Bucket><Name>srinipegaattachments</Name><CreationDate>2020-03-04T22:11:56.000Z</CreationDate></Bucket></Buckets></ListAllMyBucketsResult>[\r][\n]"
2020-04-15 22:55:13,190 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "0[\r][\n]"
2020-04-15 22:55:13,190 [main] DEBUG org.apache.http.wire -  http-outgoing-0 << "[\r][\n]"
2020-04-15 22:55:13,191 [main] DEBUG org.apache.http.impl.conn.PoolingHttpClientConnectionManager -  Connection [id: 0][route: {s}->https://s3.us-west-2.amazonaws.com:443] can be kept alive for 60.0 seconds
2020-04-15 22:55:13,192 [main] DEBUG org.apache.http.impl.conn.PoolingHttpClientConnectionManager -  Connection released: [id: 0][route: {s}->https://s3.us-west-2.amazonaws.com:443][total kept alive: 1; route allocated: 1 of 50; total allocated: 1 of 50]
2020-04-15 22:55:13,192 [main] DEBUG com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser -  Parsing XML response document with handler: class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListAllMyBucketsHandler
2020-04-15 22:55:13,203 [main] DEBUG com.amazonaws.request -  Received successful response: 200, AWS Request ID: C9BDF89507649C5C
2020-04-15 22:55:13,203 [main] DEBUG com.amazonaws.requestId -  x-amzn-RequestId: not available
2020-04-15 22:55:13,203 [main] DEBUG com.amazonaws.requestId -  AWS Request ID: C9BDF89507649C5C
Your Amazon S3 buckets are:
* srinipegaattachments