Channel: Microsoft Dynamics 365 Community

AI Leveling the Playing Field with Microsoft Dynamics 365


Competing in the Digital Age with Dynamics 365

Many small businesses disregard AI, believing it to be another tool that is only useful to, and only available to, larger organizations. This couldn't be further from the truth.

AI is actually easier for smaller businesses to implement, as they are still growing and developing their systems. They have the opportunity to incorporate AI into the fabric of their business processes from the beginning.

AI can also be extremely useful, handling data and providing insights that most small businesses would not otherwise have the staff to produce, and giving them access to the same smart decision-making tools as larger corporations at an affordable cost.

Explore the ways in which technology is once again shifting the way we do business and the opportunities available to small businesses with increased access to affordable AI and cloud service technologies. Download the "AI Leveling the Playing Field with Dynamics 365" eBook to learn more.

by TrinSoft, LLC a Microsoft Dynamics Enterprise Partner in Kentucky


From the Microsoft Dynamics 365 Business Central and NAV Blogs: Payment registration; Nonstock items; Moving a database; Direct printing


A selection of the latest insights from the Microsoft Dynamics 365 Business Central-NAV blogs

Microsoft Dynamics AX to D365 Upgrade Journeys, Part 3: Understanding the cost

FLIC - The little button that could change your world


At Microsoft Dynamics 365 Saturday in Boston, Jerry Weinstock shared with us how he is using FLIC with Microsoft Dynamics 365.

The idea is that you set up a FLIC button (a physical button) and tie it to a Microsoft Flow. The flow queries the Microsoft Dynamics 365 database, puts together a table of current information (say, the top 10 opportunities) and sends the table in an e-mail to a manager. Managers who work heavily in e-mail, but who don't have time for applications, might find this little configuration a true gift.
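
The Flow itself is click-and-configure, but to make the idea concrete, here is a rough sketch of the kind of query such a flow runs under the hood – a hypothetical example against the standard Dynamics 365 Web API, where the org URL and access token are placeholders rather than anything from Jerry's setup:

// Hypothetical sketch only: fetch the top 10 opportunities by estimated value
// from the Dynamics 365 Web API (Node 18+ for built-in fetch; org URL and
// token are placeholders). A Flow step does the equivalent of this query,
// formats the rows as an HTML table and e-mails it to the manager.
const orgUrl = "https://yourorg.crm.dynamics.com";

async function topOpportunities(accessToken) {
    const query = "/api/data/v9.0/opportunities" +
        "?$select=name,estimatedvalue" +
        "&$orderby=estimatedvalue desc" +
        "&$top=10";
    const response = await fetch(orgUrl + query, {
        headers: {
            Authorization: `Bearer ${accessToken}`,
            Accept: "application/json"
        }
    });
    const data = await response.json();
    return data.value; // array of { name, estimatedvalue, ... } records
}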

How Many Trees Have These 4 Microsoft Dynamics GP Users Saved?


According to my research, “1 tree makes 16.67 reams of copy paper.” That means that using Microsoft Dynamics GP has helped these four companies, over the years, save…..a bunch of trees. And every tree saved is a win, right? Perhaps more impactful, saving all of this paper also helped these companies save a tremendous amount of time (and file box space).

Church Homes: Stopped Printing 75,000 Pages a Year

“At one point on our AS400 system, between GL activity, Fixed Asset, and Payroll reports, I calculated that we printed 75,000 pages annually. Now we don't print anything, we just save to PDF in Dynamics GP. Plus, we no longer need to look in physical binders to get historical data. Now the reports are searchable and everybody can access them.”

IMP: Online SSRS Portal Saves 3 Days and 500 Sheets of Paper Monthly

IMP needed a way for its 35 outside sales reps to see their commission data in real time. In the past, checks were mailed out monthly with a consolidation report to show their commission calculation and a second report of new invoices in the system. It took one IMP employee three days each month to print, sort, staple, stuff, and mail the 500+ pages.

Rich Larkin, an experienced Microsoft Dynamics GP user, decided to build an online portal using Microsoft SQL Server Reporting Services (SSRS). The sales reps now receive their checks electronically via ACH transfer, then securely log into the online site to see how their checks were calculated. “Microsoft Dynamics GP is so flexible. The SSRS report was up and running in about a week,” says Larkin.

Beekley: Paperless Process Creates Stress Free Environment

“We’ve taken our process green by implementing a paperless process using the PaperSave add-on product.  This has created a stress-free environment for our associates and has allowed them to participate in continuous training for providing world-class customer care to our customers.”

 Russell Sage Foundation: Integration Saves a Lot of Time and a Lot of Paper

“In the past, one of the accountants would print out all of the payments that we’d made, hand write the check numbers, and then walk it over to me in the grants management office. It took 2 days each month to enter and re-enter the payments in both systems. Now it takes just five minutes.  I also used to spend so much time printing out documents for the finance team, and now they can log in to the system and view them. It saves us a lot of time and a lot of paper. The import/export also reduces the risk of manual error.”

How many trees does your company use every year? Could changing your business processes make an impact?
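
If you want to run the numbers for your own company, the figures above boil down to some very simple arithmetic – a back-of-the-envelope sketch, assuming 500 sheets per ream and the quoted 16.67 reams per tree:

// Rough arithmetic only, using the figures quoted above.
const sheetsPerReam = 500;
const reamsPerTree = 16.67;

function treesSavedPerYear(pagesPerYear) {
    return pagesPerYear / sheetsPerReam / reamsPerTree;
}

console.log(treesSavedPerYear(75000).toFixed(1));    // Church Homes: ~9 trees a year
console.log(treesSavedPerYear(500 * 12).toFixed(1)); // IMP's 500+ pages a month: ~0.7 trees a year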

CAL Business Solutions can review your current ERP system and make recommendations. Contact us today to start the conversation.

By CAL Business Solutions, Microsoft Dynamics GP and Acumatica Partner, www.calszone.com

Activating or deactivating somebody else’s workflow when you are not a System Admin

We were investigating the option of using a non-interactive user account for automated deployment, and it was all going great until we ran into the error below while importing the solution: “This process...(read more)

A Robotic Future: RPA Trend Predictions


Just when you think technology can't advance any further, we find a robot that does our dishes, a refrigerator that tells us how much milk is left and a thermostat that automatically adjusts based on the time of day or your phone's proximity to your home. Artificial intelligence (AI) is making it possible for technology to continue to develop, especially in the realm of robotic process automation (RPA).

According to a report by Grand View Research entitled “Robotic Process Automation Market Size, Share & Trends Analysis Report By Services (Professional Services, Training Services), By Organization, By Application, By Region, And Segment Forecasts, 2018-2024,” the global RPA market was valued at $199.1 million in 2016 and is projected to grow at a compound annual growth rate (CAGR) of 60.5%. This means that business demand for automation is growing rapidly and is expected to keep rising over the forecast years. As demand increases, the technology will need to keep being honed to meet the needs of the modern business and continue to provide value. The report also states that:

“The role of technology is evolving continuously at a faster pace. The last few decades witnessed various waves of technology progression that significantly impacted business growth. Few of these technologies are now declining as businesses globally are transforming into a dynamic digital environment."
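
To put that 60.5% CAGR in perspective, here is the rough growth arithmetic it implies – a back-of-the-envelope sketch based only on the figures quoted above, not a projection taken from the report itself:

// Compound growth implied by the figures above: $199.1M in 2016 growing
// at 60.5% per year through 2024. Illustrative arithmetic only.
const base2016 = 199.1e6; // 2016 market size in USD
const cagr = 0.605;       // compound annual growth rate
const years = 2024 - 2016;

const implied2024 = base2016 * Math.pow(1 + cagr, years);
console.log(`Implied 2024 market size: ~$${(implied2024 / 1e9).toFixed(1)}B`);
// Prints roughly $8.8B, which is why a 60.5% CAGR is such a striking number.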

Looking into the future, it's anticipated that RPA will continue to be in high demand and that the CAGR will soar in the coming years as the future becomes more and more robotic and automated. Because it replaces manual, human-driven processes with automated workflows, modern enterprises are jumping at the chance to implement, execute and scale RPA technology to meet their unique business needs. Additionally, software developers and RPA experts are continuing to hone RPA technology in order to meet the demand of modern, technologically savvy businesses.

While small- and mid-sized enterprises have more limitations and are typically not in the market for an extremely robust solution, RPA is becoming more commonplace and these companies are jumping on the RPA bandwagon. In fact, that segment is expected to see a CAGR of 62.3% between 2018 and 2024, while large enterprises are expected to register a CAGR of 59.5% over the same period. Industry experts agree that RPA will become more common even in small- and mid-sized organizations. MetaViewer's VP of Marketing and Sales, Nick Sprau, talked with MSDynamicsWorld about the benefits of RPA for enterprises of all sizes, stating, “RPA has a reputation for being the ideal solution for large, enterprise-level firms, but we've found that many small- and mid-sized businesses have also benefitted from automation. The pros for small businesses adopting RPA are the same as any large or enterprise-level organization – time and cost savings, increased efficiency, enhanced visibility and greater accuracy.”

As its market value and compound annual growth rate continue to be not just consistently strong but rapidly growing, we are excited to see more and more companies – both small and large – adopting automation technology. Robots and automation are trending, and we can safely say that they are not going anywhere anytime soon; instead they will advance in their functionality and in their ability to bring more companies into a paperless future.

When deploying your data estate to Azure, automation is the key


Time-to-data matters when migrating a data warehouse, and using automation for Azure might be the key



D365 Quick Tip: Bulk Clear field values

A very handy approach for admins and users of Dynamics 365 who want to bulk clear field values – and just selecting them and Bulk Editing them as shown below doesn't help! So here's...(read more)

Fixed – Copy Paste not working in Remote Desktop Connection – Windows 10

Copy and paste, which had been working as usual, suddenly stopped working one fine day. I had the following setting enabled and restarted the following process on the remote machine, but that also didn't help...(read more)

Using gulp plugins to transform files


In the last post, I explained how to use tasks, but I didn't yet explain how to make them useful. Tasks are most useful when they automate certain file operations, and gulp helps with reading and writing files. However, if you need to perform a transformation, you need to look for plugins that can do that work for you.

In this post you’ll learn how to set up and use various plugins that can automate some common file operations.

As the gulp documentation says: gulp plugins are Node Transform Streams that encapsulate common behavior to transform files in a pipeline – often placed between src() and dest() using the .pipe() method.

Let’s imagine that:

  • You are creating a control add-in and you are writing a lot of JavaScript
  • You want to isolate your JavaScript features into individual files, one file per feature, and you want to combine these files into one file that will be included in your control add-in
  • You want to write modern JavaScript, but still want to make sure all browsers will be able to run it (remember, anybody on a version of Windows earlier than Windows 10 who uses the NAV/BC universal client will be running Internet Explorer in there)
  • You want to minify the resulting file to save space
  • You want to zip the contents of your control add-in into the Resource.zip file

All valid assumptions, correct? If you are anything like me, this is what you do every day.

Now, some of these things can be done manually, some not really, but all of them certainly can be automated using gulp. Our job would need to contain the following transformations:

  1. Bundle JavaScript files into a single file
  2. Transpile that file into a lower version of JavaScript
  3. Minify that file
  4. Package the contents of the resource directory to produce the Resource.zip file

The more astute of you will immediately notice that all of this cannot be one task. Unless you want to complicate your life, one task should only perform those operations that can be handled with a single pipeline. A single pipeline starts with a stream of input files and transforms them until there is nothing left to transform. This means that as long as the output of the previous operation can be the input into the next operation, you can pipe that output down the pipeline. Packaging the contents of the resource directory is an operation that cannot use the output of the previous operation, so you need to isolate it into a separate task.

Another reason why you may want to separate operations into individual tasks is when an operation makes sense on its own. If you can imagine any of the previous operations as something you’d ever want to run individually, then that operation should also be isolated into a separate task. Tasks can be combined in different ways, and I’ll address that as well in one of the future posts, so don’t be afraid to split up operations that can benefit from splitting.

In my example, packaging the Resource.zip file is an operation that can be done independently of everything else. JavaScript is not the only content of the resource file: you may change CSS, you may add or modify images, configure the manifest, you name it. In all these situations, you may want to package the resource file independent of the JavaScript operation. It just makes sense to turn it into a task of its own.

Good, so we have two tasks then, with the following flow:

  1. Preparing JavaScript file
    1. Collect all JavaScript files into a single stream
    2. Pipe that into a function that combines all the files into one
    3. Pipe that into a function that transpiles your code to a lower version
    4. Pipe that into a function that minifies your code
    5. Store the result into a destination inside the resource subdirectory
  2. Zip the content of your resource subdirectory
    1. Collect all files underneath the resource subdirectory into a single stream
    2. Pipe that into a function that zips the content of the stream
    3. Store the result into a destination directory

Good. Now that we know what steps we need to do, let’s create the two tasks with minimum operations that gulp can do on its own:

const gulp = require("gulp");

function javaScript() {
    return gulp
        .src("src/*.js")
        .pipe(gulp.dest("resource/Script"));
}

function resourceZip() {
    return gulp
        .src("resource/**")
        .pipe(gulp.dest("build"));
}

So far, so good.

Let's focus on the JavaScript task first. The first operation we need to perform on the files is to bundle (or concatenate) them. If bundling the files were the only operation you needed, you wouldn't really need a plugin, as Node.js contains all the necessary APIs for that (see the sketch below). However, if your bundling is just a pass-through operation in a pipeline, you'll need a plugin. Let's look for a plugin.
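
As an aside, here is what that standalone case might look like – a minimal sketch using nothing but Node.js built-ins, assuming your feature files live in a src folder and reusing the output file name from later in this post:

// Plain Node.js concatenation, no gulp plugin involved (illustrative sketch).
const fs = require("fs");
const path = require("path");

const srcDir = "src";
const bundle = fs.readdirSync(srcDir)
    .filter(name => name.endsWith(".js"))
    .map(name => fs.readFileSync(path.join(srcDir, name), "utf8"))
    .join("\n");

fs.writeFileSync("controlAddIn.js", bundle);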

Normally, you'd do a general Google search, but gulp maintains its own plugin directory, which makes your life much easier. Go to https://gulpjs.com/plugins/ and search for “concatenate”; you'll find a number of plugins. For my example, I chose gulp-concat. When I need a plugin, I look at its documentation to see whether it can do everything I need and how simple it is to use. I also look at its GitHub repository (all of the plugins I used have a GitHub repository), where I can see how “alive” it is: how many commits and how often, are there pull requests merged into it, how many forks are out there, is the author responding to issues, etc. All of this contributes to a perceived reliability rating that finally makes me choose one plugin over another.

Good. Now that we know that the gulp-concat plugin can do the job, how do you use it?

If you remember the first post in the series, you'll remember how you imported gulp itself into your workspace: by using npm. Gulp plugins are typically packages, and to import them into your workspace, you'll also use npm. Just remember, any gulp plugin packages that you import are dependencies of your development environment, therefore you must use the --save-dev option. Let's import gulp-concat:

npm i gulp-concat --save-dev

Now that gulp-concat is a part of your solution and under the management of npm, you can use it inside your gulpfile.js script:

const concat = require("gulp-concat");

Now, we can use the concat in the pipeline like this:

.pipe(concat("controlAddIn.js"))

The next step is to transpile that file using Babel. Again, a little searching should help you find the gulp-babel plugin, which you can then install into your workspace. However, if you simply use this, you may realize that it does not work:

npm i gulp-babel --save-dev

Some plugins use other packages that they don’t bundle directly, but allow you to choose specific versions of those packages, and sometimes they will provide different install scripts. To install gulp-babel with the necessary dependencies, you should use this:

npm i --save-dev gulp-babel @babel/core @babel/preset-env

… then declare inside the gulpfile.js script:

const babel = require("gulp-babel");

… and finally pipe the concatenated file into it:

.pipe(babel({ presets: ["@babel/env"] }))

Good job! Finally, we are ready for the last step of the first task – the minification. A quick search should reveal any number of possibilities, but I’ll go with this one:

npm i gulp-uglify --save-dev

When it’s installed, declare it:

const uglify = require("gulp-uglify");

… and then pipe the transpilation results into it:

.pipe(uglify())

And you are finished. If you did everything correctly, this is now your first task:

function javaScript() {
    return gulp
        .src("src/*.js")
        .pipe(concat("controlAddIn.js"))
        .pipe(babel({ presets: ["@babel/env"] }))
        .pipe(uglify())
        .pipe(gulp.dest("resource/Script"));
}

For the second task we need to locate a plugin that can zip a stream of files. Again, search the gulp plugin catalog, and you should discover gulp-zip. You are an expert by now, so you know that you first need to install it:

npm i gulp-zip --save-dev

… then declare it:

const zip = require("gulp-zip");

… and finally use it in your zip pipeline:

.pipe(zip("Resource.zip"))

If you did everything correctly, this is the second task:

function resourceZip() {
    return gulp
        .src("resource/**")
        .pipe(zip("Resource.zip"))
        .pipe(gulp.dest("build"));
}

Perfect. The only thing that’s missing is exporting the tasks from your gulpfile.js:

module.exports.javaScript = javaScript;
module.exports.resourceZip = resourceZip;

You can try to see if these two tasks now work:

gulp javaScript

Take a look inside your resource/Script directory and you should find the controlAddIn.js script in there.

Then, run this:

gulp resourceZip

Now take a look inside the build directory and you should find the Resource.zip file in there.

Cool, but now we need another task that first invokes the javaScript task and then the resourceZip task in sequence. Luckily, gulp helps with that, too. The gulp.series() method creates a task that runs specified tasks in a serial sequence, one after another. Just export one more task from your gulpfile.js:

module.exports.build = gulp.series(javaScript, resourceZip);

If you now delete the files that the previous two tests created, and then run this:

gulp build

… you will see that it has correctly built your Resource.zip file.

Just in case, here’s my latest state of the gulpfile.js:

const gulp = require("gulp");
const concat = require("gulp-concat");
const babel = require("gulp-babel");
const uglify = require("gulp-uglify");
const zip = require("gulp-zip");

function javaScript() {
    return gulp
        .src("src/*.js")
        .pipe(concat("controlAddIn.js"))
        .pipe(babel({ presets: ["@babel/env"] }))
        .pipe(uglify())
        .pipe(gulp.dest("resource/Script"));
}

function resourceZip() {
    return gulp
        .src("resource/**")
        .pipe(zip("Resource.zip"))
        .pipe(gulp.dest("build"));
}

module.exports.javaScript = javaScript;
module.exports.resourceZip = resourceZip;
module.exports.build = gulp.series(javaScript, resourceZip);

In my next post, I’ll cover two more topics: deploying the control add-in to an NAV / Business Central instance using PowerShell from a gulp task, and passing configuration to gulp tasks from JSON configuration files.


Read this post at its original location at http://vjeko.com/using-gulp-plugins-to-transform-files/, or visit the original blog at http://vjeko.com.

The post Using gulp plugins to transform files appeared first on Vjeko.com.

DynamicsPerf 2.10 Deployment Guide


INTRODUCTION

The DynamicsPerf tool (Performance Analyzer for Dynamics) is used by the Microsoft Dynamics Support team, Premier Field Engineers and Product Group team members to diagnose performance issues with Dynamics products such as Dynamics 365 Local Business Data (On Prem), Dynamics AX and Dynamics CRM.

IMPORTANT: Performance Analyzer is meant to be used on a continual basis, so it is important for administrators to understand the components that make up the tool and to ensure all jobs and collectors are running.

DynamicsPerf collects a variety of pertinent information from the database server, Application Object Server (AOS), and application server. This information is captured by a number of collectors provided by DynamicsPerf and includes query statistics, query plans, index statistics, database and AOS server configurations, AOS event logs and Application Object Tree (AOT) metadata. In addition, blocking and deadlocking events are collected through SQL Extended Events, while performance counter data is collected from the database and AOS servers.

Deployment Models

The DynamicsPerf database is the central repository for most of the data collected by the Performance Analyzer tool.

Performance Analyzer is delivered as two SQL Server solution files:

  1. Performance Analyzer 2.10 Installation.ssmssln – a set of SQL jobs, X++ classes, Visual Basic scripts and performance counters to initiate the collection process.
  2. Performance Analyzer 2.10 Analyze Data.ssmssln – a set of sample SQL scripts that can be used to query and analyze the populated tables and views in the DynamicsPerf database.

Before we begin, you can read about DynamicsPerf at:

Deploying DynamicsPerf

There are several steps that need to be completed in order to successfully deploy DynamicsPerf. The toolset is meant to be deployed and set up for data collection on a continual basis throughout the life of your Microsoft Dynamics products. This ensures that if performance issues arise, you are able to quickly identify the bottleneck and have a baseline for comparison purposes. We currently have scripts for Dynamics AX and Dynamics CRM in version 2.10.

Deployment Setup Checklist

The following is a summarized checklist of the steps to deploy Performance Analyzer. See the steps below for detailed information.

Step 1 – Run Script 1, Create Core Objects; this creates the DynamicsPerf database and all of its objects.

Step 2 – Run Script 2, Deploy DynamicsPerf Schedules (pick your desired collection timeframe).

Step 3 – If DynamicsPerf is being installed remotely, run Script 3, Setup Linked Servers; otherwise skip this step.

Step 4 – Run Script 4, Configure DBs to Collect, specifying the Dynamics databases that you want to collect data for.

Step 5 – Run Script 5, Setup SSRS Data Collection; this will allow collection of SSRS report performance data.

Step 6 – Run Script 6, Install Fulltext Indexes, in the DynamicsPerf database (the FULLTEXT Index Service must be installed).

Step 7 – Run Script 7, Deploy Extended Events; this MUST BE RUN on the SQL instance hosting your Dynamics application database (SQL 2012 and above only).

Step 8 – Set up security using this blog.

Step 9 – Deploy the Windows Perfmon counters.

Step 10 (Dynamics components) – For the Dynamics AX/365 components, use the following link: http://blogs.msdn.com/b/axinthefield/archive/2015/12/30/dynamicsperf-2-0-installation-for-dynamics-ax.aspx

Before you begin

Before you deploy Performance Analyzer, you must complete the following:

  1. Extract the DynamicsPerfxxx.zip file to a location that you can browse to from the database server
  2. Make sure you have rights to create new databases on the database server
  3. Verify you have read access to the Dynamics Application database
  4. Verify also you have write access to the DynamicsPerf database (this database gets created as part of Performance Analyzer)
  5. Ensure you have created a local folder on the database server called \SQLTRACE to store the extended event files that get generated


STEP 1 - Create Database Objects, and Jobs

In order to use Performance Analyzer, you must first create the DynamicsPerf database, its objects, and its jobs. The following steps walk through that process.

  1. On the database server, open SQL Server Management Studio (SSMS)
  2. Click File>Open, Project/Solution
  3. Browse to the location for where you extracted the DynamicsPerf2.0.zip
  4. Select the Performance Analyzer 2.10 Installation.ssmssln file
  5. In Solution Explorer, open the 1-Create_Core_Objects.sql script
  6. Execute the script. [This will create the DynamicsPerf database and SQL jobs.]


NOTE: Ensure that you read the notes in the script if you wish to path the DynamicsPerf files to a location other than the C drive

NOTE: If installed on SQL Server 2008 R2, you WILL get errors. Not all of the new views are compatible with SQL Server 2008 R2. It is highly recommended to do a remote installation of DynamicsPerf 2.10 on SQL 2012, collecting data from your older SQL 2008 R2 server.

99% of DynamicsPerf 2.10 works on SQL Server 2008 R2. It's just a couple of new views and extended events that don't work on SQL 2008 R2.

STEP 2 - Deploy DynamicsPerf Schedules

DynamicsPerf 2.10 has its own internal scheduling engine separate from SQL Server Agent. This makes it possible to have 50 different collectors without having to schedule 50 different tasks in SQL Server Agent. This also makes it much simpler to control via tables in the DynamicsPerf database.

The three main scheduling tables are:

  • DATABASES_2_COLLECT
  • DYNPERF_TASK_SCHEDULER
  • DYNPERF_TASK_HISTORY

  1. In Solution Explorer, open the 2-Deploy DynamicsPerf Schedules.sql script
  2. Execute the script. [This will deploy the tasks and schedules for the DynamicsPerf database.]

NOTE: Use script 2.0 for normal hourly collection. Use script 2.1 to collect every 5 minutes when doing more active troubleshooting.


STEP 3 – Setup Linked Servers

In order to collect data from a Microsoft Dynamics product database or SSRS database that is on a different SQL Server than the DynamicsPerf database, you must set up Linked Servers for them. Otherwise, this step is optional.

NOTE: You will need to set up a Linked Server for EACH remote database server.

NOTE: You MUST ENABLE the Distributed Transaction Coordinator in Windows Services on the server where you are setting up the Linked Server.

There are two ways that you could set up the Linked Server: one is using the provided script in the solution, and the other is directly in SQL Server Management Studio (SSMS). We'll provide steps for both.

A diagram in the original post illustrates the Linked Servers that would need to be set up for two configurations.

Configure Linked Server via TSQL scripts
  1. In Solution Explorer, open the 3-Setup Linked Servers(optional).sql script
  2. Edit the script as described below

Step 1 Create the Linked Server

  • Configure @server= with the remote SQL Server that you want to collect data from
  • Configure @datasrc= with the DNS name of your SQL Server (the example is an Azure SQL Database, which is supported)
  • Configure @catalog with the database name of your Dynamics product database, or ‘ReportServer’ if setting up for SSRS

Execute this script to create the Linked Server

Step 2 Add credentials and options to this linked server


Configure @server= with the remote SQL Server that you want to collect data from.

If you will be using Windows Authentication, leave the script as is. If you will be setting up SQL Authentication, change @useself = ‘FALSE’, and uncomment the last 2 lines and fill in as appropriate.

NOTE: If using Windows Authentication, the SQL Server Agent MUST BE started with a DOMAIN account that has rights to the remote server.

NOTE: This account must have administrative rights to query SQL DMVs on the remote server.

Execute this script to create the Login for the Linked Server

Step 3 Enable the correct options for the Linked Server


Replace ‘SQL_NAME_HERE’ with the name of the Linked Server set up in the previous steps

Execute this script to set the correct options for the Linked Server

Additional information about setting up linked servers can be found at: https://msdn.microsoft.com/en-us/library/ff772782.aspx

Configure Linked Server via SQL Server Management Studio (SSMS):

In SSMS, navigate to Server Objects > Linked Servers, right-click, and select New Linked Server.

Next, click the SQL Server selection box and enter the name of your remote SQL Server.

Next, click the Security options and fill them out as necessary.

Next, select the Server Options tab and set the following options.

Click OK to set up the Linked Server.

Step 4 – Configure Dynamics databases to be collected

This step populates the DATABASES_2_COLLECT table in the DynamicsPerf database.  This will start data collection immediately for the database.

NOTE: Complete this step for EACH database you would like to gather performance data on.

  1. In Solution Explorer, open the 4-ConfigureDBs to Collect.sql script
  2. Edit the script, replace the ‘SQL_NAME_HERE' with the SQL Server that hosts the database.  Replace 'DB_NAME_HERE' with the database to be collected
  3. Execute the script

STEP 5 – Setup SQL Server Reporting Server (SSRS) data collection

This step populates the SSRS_CONFIG table in the DynamicsPerf database.  This will start data collection immediately for the SSRS database.

  1. In Solution Explorer, open the 5-Setup SSRS Data Collection.sql script
  2. Edit the script, replace the ‘SSRS_SERVER_NAME_HERE’ with the SQL Server that hosts the ReportServer database.
  3. Execute the script

NOTE: If the SSRS is remote from the SQL Server hosting the DynamicsPerf toolset, you will need to setup a Linked Server per the steps in that section.

STEP 6 – Install the FULLTEXT indexes into DynamicsPerf

This step puts FULLTEXT indexes into the DynamicsPerf database.  This allows for more dynamic investigation of the data collected.

  1. In Solution Explorer, open the 6-Install Fulltext Indexes for DynamicsPerf.sql script
  2. Execute the script

STEP 7 – Deploy Extended Events to SQL Server

This step deploys Extended Events to the SQL Server hosting your Dynamics application database. This collects blocking and other performance-related event data. This is the replacement for SQL Trace.

PREREQUISITE: You must create a C:\SQLTRACE folder on the SQL Server where you are deploying this script. If you change that folder location, you will have to edit the scripts to point to the new location.

  1. In Solution Explorer, open the 7-Deploy Extended Events.sql script (SQL 2008 R2 installations use the 7a-SQL2008R2 Blocking Jobs.sql script instead of the extended events)
  2. Connect to the appropriate SQL Server instance (not the DynamicsPerf SQL Server, unless this is a local install; see the instructions below)
  3. Execute the script

For SQL2008R2 installations, do the following:

  1. In Solution Explorer, open the 7a-Blocking Jobs.sql script
  2. Connect to the appropriate SQL Server instance (not the DynamicsPerf SQL Server, unless this is a local install; see the instructions below)
  3. Execute the script
  4. Open SQL Server Agent on the instance to which you deployed the blocking job
  5. Open the Dynperf_Default_trace_start job and enable the job
  6. Edit the script to change the path location if you created a directory other than C:\SQLTRACE on the server

Step 8 – Setup Security

Please follow this link for setting up the necessary security for the DynamicsPerf toolset.

https://blogs.msdn.microsoft.com/axinthefield/dynamicsperf-2-0-setting-up-security-for-remote-collection/



STEP 9 - Configure and Schedule Performance Counter Logging on Database Server

Please follow this link for information on setting up Windows Performance Monitor: http://blogs.msdn.com/b/axinthefield/archive/2016/01/05/setting-up-windows-performance-monitor-templates.aspx

Step 10 – Install Dynamics components

Please follow this link for setting up the Dynamics specific components:

https://blogs.msdn.microsoft.com/axinthefield/dynamicsperf-2-10-installation-for-dynamics-365-local-business-data-on-prem-dynamics-ax/


Deployment Verification Checklist

The following is a list of items that should be checked periodically to ensure Performance Analyzer is running and collecting the data.

Step 1 – Open the Monitor DynamicsPerf Health.sql script

We have provided a script to monitor the data collection process. The script is called Monitor DynamicsPerf Health.sql in the installation solution. This script shows the CAPTURE_LOG for all events. It also displays the databases that have been set up to collect. The third dataset is the Task History table, which has a LAST_RUN column that displays the last time each task was run. You can review this data to verify that data collection is set up and running without issues.


Step 2 – Review the CAPTURE_LOG TABLE

Check the CAPTURE_LOG table in the DynamicsPerf database


Step 3 - Review collected data

Open the Performance Analyzer Analyze Data solution and review the collected data


Regards,

Rod “Hotrod” Hansen


DynamicsPerf 2.10 Installation for Dynamics 365 Local Business Data (On Prem) / Dynamics AX


INTRODUCTION

Please be sure to install the core components of Performance Analyzer for Microsoft Dynamics before completing this guide.

https://blogs.msdn.microsoft.com/axinthefield/dynamicsperf-2-10-deployment-guide/

The Performance Analyzer is delivered as a SQL Server solution and consists of a number of collectors as SQL jobs, X++ classes, VB scripts and performance counters. It also includes a set of sample SQL scripts that can be used to query and analyze the populated tables and views in the DynamicsPerf database.

The collectors that make up the Performance Analyzer are categorized within this document as the following:

  • Capture AOT Metadata
  • Capture AOS Settings and Event Logs

We will discuss each one of the collectors specific to Dynamics AX in the following sections and the process for deploying and maintaining Performance Analyzer in later sections.

DEPLOYING PERFORMANCE ANALYZER Dynamics AX Components

There are several steps that need to be completed in order to successfully deploy Performance Analyzer for Microsoft Dynamics AX. The Performance Analyzer is meant to be deployed and set up for data collection on a continual basis throughout the life of your Dynamics AX instance. This ensures that if performance issues arise, you are able to quickly identify the bottleneck and have a baseline for comparison purposes. You must have completed the steps in the Performance Analyzer for Dynamics Deployment and User Guide – Core Installation before completing the following steps.

Deployment Setup Checklist

The following is a summarized checklist of the steps to deploy Performance Analyzer. See the steps below for detailed information. These steps continue from https://blogs.msdn.microsoft.com/axinthefield/dynamicsperf-2-10-deployment-guide/.

Step 1 – Configure and Schedule AOS Configuration and Event Logs Capture

Step 2 – Configure and Schedule AOT Metadata Capture (AX)

Step 3 – Enable Long Running Query Capture for AX (AX)

Step 4 – Configure and Schedule Performance Counter Logging on AOS Server(s)

Before you begin

Before you deploy Performance Analyzer, you must complete the following:

  1. Extract the DynamicsPerfxxx.zip file to a location that you can browse to from the database and AOS servers
  2. Make sure you have rights to create new databases on the database server
  3. Verify you have read access to the AX database
  4. Verify also that you have write access to the DynamicsPerf database (this database gets created as part of Performance Analyzer)
  5. Ensure you have Admin permissions to each of the AOS servers connected to the AX database
  6. Ensure you have created a local folder on the database server called \SQLTRACE to store the trace files that get generated
  7. Make sure every active AOS server in the AX instance has been started with the ‘Allow client tracing on Application Object Server instance’ checkbox enabled (2009 only step)

 

STEP 1 - CAPTURE AOS SETTINGS AND EVENT LOGS

The Capture AOS Settings and Event Logs collector will capture AOS configuration and event logs from each active AOS Server in the environment.

NOTE: THIS TASK HAS BEEN GREATLY SIMPLIFIED SINCE PREVIOUS VERSIONS OF DYNAMICSPERF.

This task is now implemented as a SQL Server Agent Job.

  1. Open SQL Server Management Studio on the server with DynamicsPerf installed
  2. Navigate to SQL Server Agent | Jobs
  3. Right-click the DYNPERF_COLLECT_AOS_CONFIG job and select Properties
  4. Click Steps on the left side of the screen and then click the Edit button at the bottom middle of the screen
  5. Change the first 4 lines of code to the appropriate SQL Server and database names
  6. Click OK twice

Test the job

  1. Right-click the DYNPERF_COLLECT_AOS_CONFIG job and select Start Job at Step
  2. Verify the job runs

 

STEP 2 - Configure and Schedule AOT Metadata Capture

To be able to review the table and index property settings from within the AOT for AX tables, you will configure and schedule the AOT metadata capture. The data collected will be stored in the DynamicsPerf database.

NOTE: This new version no longer creates any tables in your Dynamics AX database. It exports directly to the DynamicsPerf database; this collector is initiated through the AOTExport X++ class.

  1. Launch a Dynamics AX client
  2. Open the Application Object Tree (AOT) in Dynamics AX (Ctrl+Shift+W to open a new developer workspace)
  3. Click the Import icon
  4. Browse to the dynamicsperf\scripts-dynamics ax\PrivateProject_AOTExport2012_DynamicsPerfDirect.xpo file found where you extracted the files in step 1 of the “Before you begin” section
  5. Click OK to import
  6. Navigate to Classes in the AOT
  7. Select the AOTEXPORT2012Direct class (it will be 2009 for that version)
  8. Right-click the class and select Open
  9. To run it now, fill out the dialog correctly and press OK
  10. To set it up as a batch job, click the Batch tab and complete the dialog as desired

 

 

STEP 3 - Enable Long Running Query Capture for AX

If you are using Dynamics AX, you can set thresholds that capture long-running queries along with the Dynamics AX source code that issued them. In the following steps you will configure the system to capture long-running queries. The data collected will be stored in the DynamicsPerf database.

NOTE: This enables long-duration tracing for all AX users by updating the USERINFO table and sets the long-running query threshold to 5000 ms (5 seconds). If you are using a version of Dynamics AX prior to AX 2012, the ‘Allow client tracing on Application Object Server instance’ checkbox on the AOS Server Configuration Utility must be marked for each AOS server before executing this stored procedure, and this requires a restart of the AOS.

  1. On the database server, open SQL Server Management Studio (SSMS)
  2. Click File > Open > Project/Solution
  3. Browse to the location where you extracted the DynamicsPerf2.0.zip
  4. Select the Performance Analyzer 2.10 Analyze Data.ssmssln file
  5. In Solution Explorer, open the Dynamics AX Client Tracing.sql script
  6. Change xxxxxxx to the name of your AX database
  7. Execute only the part listed below from the script against the DynamicsPerf database to enable client tracing for all AX users

/****************** Set AX Client tracing *************/
/* NOTE: must enable AX client tracing on the AOS servers */

USE DynamicsPerf
GO

EXEC SET_AX_SQLTRACE
    @DATABASE_NAME = 'xxxxxxxxx',
    @QUERY_TIME_LIMIT = 5000

8. To view the results of a user within the application:

a. Open Dynamics AX

b. Go to Tools > Options

c. Select the SQL tab

d. Notice the SQL checkbox is marked, the long query threshold is 5000, and the Table (database) checkbox is enabled

 

 

STEP 4 - Configure and Schedule Performance Counter Logging on AOS Server(s)

To log valuable information about your AOS servers such as cpu, memory, etc., it is important to configure and schedule the performance counter logging.

Please follow the steps in this article for setting up Windows Performance Counters:

http://blogs.msdn.com/b/axinthefield/archive/2016/01/05/setting-up-windows-performance-monitor-templates.aspx

OTHER COMMANDS AND PROCEDURES

This section describes other commands and processes that can be used with the Performance Analyzer.

Disable Long Running Query Capture for AX

To disable the long running query capture for AX, follow these steps:

  1. On the database server, open SQL Server Management Studio (SSMS)
  2. Click File>Open, Project/Solution
  3. Browse to the location for where you extracted the DynamicsPerf2.0.zip
  4. Select the Performance Analyzer 2.10 Analyze Data.ssmssln file
  5. In Solution Explorer, open the Dynamics AX Client Tracing.sql script
  6. Change <dbname> to the name of your AX database
  7. Execute only the part listed below from the script against the DynamicsPerf database to disable client tracing for all AX users
/****************** Set AX Client tracing *************/
/* NOTE: must enable AX client tracing on the AOS servers */

USE DynamicsPerf
GO

EXEC SET_AX_SQLTRACE
    @DATABASE_NAME = '<dbname>',
    @QUERY_TIME_LIMIT = 5000
  • To view the results of a user within AX:
    1. Open Dynamics AX
    2. Go to Tools>Options
    3. Select the SQL tab
    4. Notice the SQL checkbox is unmarked, the long query threshold is blank, and the Table (database) checkbox is disabled

    Regards,

    Rod “Hotrod” Hansen

Electronic Reporting: Import of GL Excel Journals (Part 3)

This third and last post on GL journal imports with the help of Electronic Reporting focuses on the import of so-called split postings where one debit account and multiple credit accounts (or vice versa...(read more)

Desktop Icons Suddenly Small


I'm posting this one because it is quite silly and I'm probably not going to remember it if I don't write it down.

I logged onto my main desktop the other day and all of the desktop icons were a lot smaller than usual:

Desktop with small icons

I had no idea how this had happened. I have syncing set up with OneDrive and had recently installed and configured an old laptop with a 1366×768 display, while my main desktop is 2560×1440, so my initial thought was that it was down to this mismatch. However, I couldn't find anything which looked like it was the cause of the problem.

I resorted to an online search, but didn't immediately find anything. I did, however, stumble onto the resolution, and this was probably what caused my problem initially.

On one web page, I zoomed in using Ctrl+mouse scroll wheel, but my mouse wasn't over the browser window; it was over my desktop. The icons on my desktop became even smaller; reversing the scroll direction made the icons bigger again:

Desktop with normal icons

I've no idea if you've always been able to zoom the desktop or if it is new, but now that I know, I can be more careful when zooming in on a website and make sure the mouse cursor is over the browser.

Read original post Desktop Icons Suddenly Small at azurecurve|Ramblings of a Dynamics GP Consultant; post written by Ian Grieve (Lead ERP Consultant at ISC Software Solutions)

Integration With Dynamics 365 CE And Companies House API – Azure Logic App


Introduction and Background

Before I start the overview of this process, I think it's worth noting that over the last few years, the options for integration – and, to be fair, any sort of extension and development in Dynamics 365 – have been shifting from being a C#-only domain, or at least a coding exercise, to an extensive toolset: Azure and Office 365, to mention a few.

So for this example – integration of data from an external data source – I will be going through the process in what I consider the easiest way with the lowest overhead in terms of coding. Yes, there are a number of different options for achieving this, and coding something is one of those, but my favourite expression at the moment is “Don't crack a walnut with a sledgehammer”.

So a couple of assumptions in terms of your setup. You’ll need:

  • An Azure account
  • A Dynamics 365 instance
  • A basic understanding of APIs and JSON

The use case for this integration is to call out to the Companies House API and pull back some company data into Dynamics 365. There is a lot of information stored there about limited companies in the UK, and it could be really useful.

The Companies House API is still in beta… to be fair it's been in beta for a couple of years, but it seems fairly stable, and there is a wealth of documentation and a forum. You will need to create an account on the site in order to access and use the APIs.

You can do this, and access the documentation and Forum here: https://developer.companieshouse.gov.uk/api/docs/

The process is going to be something like this: a new Account in Dynamics 365 triggers a Logic App, which calls the Companies House API, parses the returned JSON and updates the Account record.

So one of the first things you'll need to do once you have created an account on the Companies House website is register an application. This process then assigns you an API key used for authentication. There are a few options for the type of API, restriction of IP addresses and so on, but for this example we'll go for a REST API. Once this is completed, you'll get an API key.

The next step is to set up your Azure Logic App. I'll skip the bits around setting up Resource Groups etc. Click on Create Resource, Integration, then Logic App:

Add the details:

Once this is all done, you can start to build out your logic. When your resource is ready you can navigate to it. If you’re familiar with Microsoft Flow, you might recognise some similarities.

There’s a reason why Logic Apps and Flow look similar. Flow is built on top of Logic Apps. You can read more details about the similarities and differences here

Click on the Blank Logic App and add the trigger for Dynamics 365. In this example I am going to trigger the integration when a new Account is created, but it could also be when a record is updated and so on.

So the next stage is to use the HTTP connector and set it up to point to the Companies House API. For authentication use Raw, and you'll only need the value (no need for a username and password).

You'll also notice that the URI is appended with a Dynamics field, in this case the Account Number field, which needs to store the company number, as this is the format for looking up the company. Obviously that means you need to know the company number!
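
For reference, the call the HTTP action makes is an ordinary REST request. Here is a hedged sketch of the equivalent in code, based on the Companies House documentation (the base URL and the Basic-auth-with-API-key scheme come from their docs; the company number and environment variable name are made up):

// Illustrative sketch (Node 18+ for built-in fetch). The API key registered
// earlier is used as the username of an HTTP Basic credential with an empty
// password; "08123456" is a made-up company number.
const apiKey = process.env.COMPANIES_HOUSE_API_KEY;

async function getCompany(companyNumber) {
    const response = await fetch(
        `https://api.companieshouse.gov.uk/company/${companyNumber}`,
        {
            headers: {
                Authorization: "Basic " + Buffer.from(apiKey + ":").toString("base64")
            }
        }
    );
    if (!response.ok) {
        throw new Error(`Companies House returned ${response.status}`);
    }
    return response.json(); // shape matches the Parse JSON schema below
}

getCompany("08123456").then(c => console.log(c.company_name, c.company_status));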

Once the HTTP action is set, you need to parse the resulting JSON. For this you need the Parse JSON operation. Here you will set the schema for the returned JSON.

For the GET Company Schema use the following:

{
    "properties": {
        "accounts": {
            "properties": {
                "accounting_reference_date": {
                    "properties": {
                        "day": {
                            "type": "string"
                        },
                        "month": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "last_accounts": {
                    "properties": {
                        "made_up_to": {
                            "type": "string"
                        },
                        "period_end_on": {
                            "type": "string"
                        },
                        "period_start_on": {
                            "type": "string"
                        },
                        "type": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "next_accounts": {
                    "properties": {
                        "due_on": {
                            "type": "string"
                        },
                        "overdue": {
                            "type": "boolean"
                        },
                        "period_end_on": {
                            "type": "string"
                        },
                        "period_start_on": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "next_due": {
                    "type": "string"
                },
                "next_made_up_to": {
                    "type": "string"
                },
                "overdue": {
                    "type": "boolean"
                }
            },
            "type": "object"
        },
        "can_file": {
            "type": "boolean"
        },
        "company_name": {
            "type": "string"
        },
        "company_number": {
            "type": "string"
        },
        "company_status": {
            "type": "string"
        },
        "confirmation_statement": {
            "properties": {
                "last_made_up_to": {
                    "type": "string"
                },
                "next_due": {
                    "type": "string"
                },
                "next_made_up_to": {
                    "type": "string"
                },
                "overdue": {
                    "type": "boolean"
                }
            },
            "type": "object"
        },
        "date_of_creation": {
            "type": "string"
        },
        "etag": {
            "type": "string"
        },
        "has_been_liquidated": {
            "type": "boolean"
        },
        "has_charges": {
            "type": "boolean"
        },
        "has_insolvency_history": {
            "type": "boolean"
        },
        "jurisdiction": {
            "type": "string"
        },
        "last_full_members_list_date": {
            "type": "string"
        },
        "links": {
            "properties": {
                "filing_history": {
                    "type": "string"
                },
                "officers": {
                    "type": "string"
                },
                "persons_with_significant_control": {
                    "type": "string"
                },
                "self": {
                    "type": "string"
                }
            },
            "type": "object"
        },
        "registered_office_address": {
            "properties": {
                "address_line_1": {
                    "type": "string"
                },
                "address_line_2": {
                    "type": "string"
                },
                "locality": {
                    "type": "string"
                },
                "postal_code": {
                    "type": "string"
                }
            },
            "type": "object"
        },
        "registered_office_is_in_dispute": {
            "type": "boolean"
        },
        "sic_codes": {
            "items": {
                "type": "string"
            },
            "type": "array"
        },
        "status": {
            "type": "string"
        },
        "type": {
            "type": "string"
        },
        "undeliverable_registered_office_address": {
            "type": "boolean"
        }
    },
    "type": "object"
}

If all goes well, when you add the final operation – Update Record – you should see all the returned data options:

Obviously the fields need to be available on the Dynamics 365 Accounts form.

Then… time to test!

Add a new Account in Dynamics and include the Company Number:

Once you save the record, the call is made to the Companies House API, and it returns any details it finds:

So this is very much a click-and-configure integration, but I think it's a nice use of the Logic App solution, and an easy integration with REST APIs.

Changing Password Managers


Back in October last year, I posted a couple of articles around my recommendations and use of Two-Factor authentication. (Here’s part 1 and part 2 for those articles, if you’re interested in reading them.)

In my part 2 post, I indicated that I was also using a password manager and was using LastPass at the time. I have recently switched to 1Password and after posting a couple of things about that on Twitter, a few friends and followers expressed interest in knowing more about why I switched and how I would compare the two of them.

Brief background

I used Roboform for many years, but started finding it a bit clunky. Yes, I realize that is completely useless as any viable feedback about a product, but that's the best way I can phrase it. I just found the UI “tired” and was looking for something new.

So, about a year ago, I made the switch to LastPass. I wish I could remember more of my reasons for changing but I can’t any longer, they’re gone from my memory! LastPass appeared to have some features that Roboform didn’t (at the time, at least) so that’s what I chose. It’s quite likely that what I liked about LastPass is in the Roboform product today for all I know, as the “good” password manager products out there do keep adding new features to their products. I do consider all 3 products I’ve used to be good products, I just now prefer 1Password.

Why did I switch?

When I switched to LastPass, I also looked at 1Password at that time and, to be honest, I liked that LastPass had an “always free” version, whereas 1Password only has a 30-day trial (after which it is a paid product). Fast forward to the present day: I've learned more about infosec generally in the last year (out of interest) and decided to make a change.

My (somewhat lame) first reason

The more I read about data breaches and infosec news, the more I am subscribing to the theory of “if you’re not paying for the product, YOU are the product”. This isn’t the “real” reason I changed, as I could have simply upgraded to a premium version of LastPass. It was just one of the things that made me re-think my decision as cost was a factor a year ago when I don’t think it should have been.

The key reason: integration with Have I Been Pwned

LastPass has a “security center” feature which, among other things, checks if any of your logins were compromised (in a data breach). 1Password takes this to the next level, in my opinion. It integrates with Have I Been Pwned (HIBP for short, Troy Hunt’s service around data breaches) in its Watchtower feature, in two ways:

  1. Checking your logins against the HIBP service to see if they’ve been in a breach. LastPass does the same thing, although I’m unsure if it uses HIBP.
  2. Checking your passwords against HIBP's Pwned Passwords service to see if they have been found in a breach.

I was specifically interested in the second feature above. I already subscribe to HIBP's feature that will send you an email when one of your accounts is in a data breach, once they become aware of it. I typically use only one email address that is only for logins, not for friends, family or other correspondence. Oddly enough, in the 15-20 years I've used this as a login, it's only been in one known breach, and that was the most recent one, called “Collection #1”. That's simply good luck, nothing I can claim I'm doing to avoid that situation. My business email was in the LinkedIn data breach years ago, which is an example of a site where the random login email account wouldn't have been appropriate for business communications, and I've paid for that by it being in a breach and sitting on lists god-knows-where on the dark web.

I have taken the time to ensure that nearly every site I use has its own unique, randomly-generated password, so what I'm really interested in watching for now are instances where those passwords have been in a data breach. I'll describe below how this works, because it sounds like "they're sending your passwords over the internet?" No, they're not. Now that I'm confident most of my logins are adequately secure and complex, if any of those passwords appears in a data breach, I will know exactly which site it might have come from, as my sites and passwords are a 1:1 relationship. If one of those passwords is in a breach, it doesn't necessarily mean MY account was in the breach (someone else may have ended up with the same randomly-generated password), but the odds are it's mine and I can go to that site and change my password.

Troy Hunt and HIBP

I have followed Troy Hunt on Twitter since I became a Microsoft MVP (when I first became aware of him). Troy is a fellow Microsoft MVP as well as a Microsoft Regional Director (more about the RD program here – think "Super MVP"!). He is the creator of the incredibly useful Have I Been Pwned service.

After listening to him speak and getting a chance to meet him last year at MVP Summit, I have taken an even keener interest in infosec. As I've listened to his weekly podcasts over the past year, I've realized I still have a lot to learn about it.

His HIBP service is free and originally started out checking whether your email account was in a data breach. More about the site can be found here, on the About page, as well as on the FAQ page. Here's a clip of the stats from the landing page showing what's in his databases at the moment.

The key take-away here is that the pwned logins are not stored alongside the pwned passwords. If your email is in a data breach, there is no service where he tells you which password was used in that breach. Lots of people argue with him over that, but the reasons are clear, and he's blogged about them here.

The Pwned Passwords feature interests me on a level where I don't entirely understand the technical details, but I do understand the concept. For this check, 1Password hashes the password locally (it is never sent anywhere in plain text); the Watchtower check then sends only the first 5 characters of that hash to HIBP's API, which returns all of the stored hashes that begin with that prefix. 1Password compares my full hash against the returned list to see if it's there. No passwords are ever passed back and forth, and HIBP only ever sees a partial hash, so only my end knows the full hash (and therefore the password) at any given time.

His example is this: let’s say you have “P@ssw0rd” in your list. The hashed value of that is “21BD12DC183F740EE76F27B78EB39C8AD972A757”. 1Password sends “21BD1” and HIBP returns (at the time he wrote this example) 475 results, for the items that start with “21BD1”. If “2DC183F740EE76F27B78EB39C8AD972A757” is in that list, then the password is flagged as compromised. More details around this example are on this post.
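
For the curious, here is a minimal sketch of that same k-anonymity check in Python, calling HIBP's public Pwned Passwords range API directly. This is just to illustrate the concept (it is not 1Password's actual code), and the function and User-Agent names are my own placeholders; the endpoint and "SUFFIX:COUNT" response format are the ones Troy documents for the range API.

    import hashlib
    import urllib.request

    def pwned_count(password: str) -> int:
        """Return how many times a password appears in HIBP's Pwned Passwords data.

        Only the first 5 characters of the SHA-1 hash are sent to the API
        (the k-anonymity model); the full password and full hash never leave
        this machine.
        """
        sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
        prefix, suffix = sha1[:5], sha1[5:]

        req = urllib.request.Request(
            f"https://api.pwnedpasswords.com/range/{prefix}",
            headers={"User-Agent": "pwned-check-example"},  # polite identification
        )
        with urllib.request.urlopen(req) as resp:
            body = resp.read().decode("utf-8")

        # The response is one "SUFFIX:COUNT" pair per line for every hash
        # sharing the 5-character prefix; compare locally against our suffix.
        for line in body.splitlines():
            candidate, _, count = line.partition(":")
            if candidate.strip() == suffix:
                return int(count)
        return 0

    # Troy's example password; expect a very large count.
    print(pwned_count("P@ssw0rd"))

Running this with a password you actually use (locally, never paste real passwords into random websites) shows how a password manager can flag "vulnerable" passwords without ever transmitting them.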

Comparisons

Things I prefer in 1Password

It’s a bit too early to do a LOT of comparing but already I’m seeing things I like a lot better in 1Password than LastPass. Here’s the Watchtower “menu”:

To be fair, both products have features around re-used passwords, weak passwords and compromised logins. I described above why I was interested in the vulnerable passwords feature. I like the bottom 3 features as well, which I don't recall equivalents of in LastPass: insecure websites (HTTP rather than HTTPS), sites with 2FA options that I'm not using, and expiring passwords. I've already used the unsecured websites feature to update my saved URLs to HTTPS wherever the site supports it. Most did, but the old password manager just kept whatever URL I saved at the time, and many of those were the HTTP version.

This (above) is what appears in the Unsecured Website area, or if I'm on a site where my login is in that list. It will actually not let me auto-fill that login or password (or so it seems), which I find a tad annoying. I'm fine with being alerted to the issue, but if the site isn't secure and there is no HTTPS option, I would prefer it auto-fill my login and password anyway.

This is a snapshot of one of my vaults. Today the Unsecured Websites number is actually 2, both of which are sites I can't change. I have several sites where I can enable 2FA, so that's my next target. I don't have any compromised logins, weak passwords or vulnerable passwords, so that's a good thing.

Another thing I noticed and like is the choice of type of random password generation, beyond the typical length and whether to include numbers or symbols. 1Password gives you a "memorable password" option, which sounds dumb on the surface, but it helps create passphrases that are easier to remember while still being long enough to be a genuine security improvement.
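
As an aside, the general idea behind a "memorable password" generator is simple: pick a handful of random words from a large word list using a cryptographically secure random source. A rough sketch follows; it is only an illustration of the concept, not how 1Password implements its generator, and the tiny word list is a stand-in for a real list of several thousand words.

    import secrets

    # Stand-in word list; a real generator would use a large diceware-style list.
    WORDS = ["correct", "horse", "battery", "staple", "orange", "cobalt", "ladder", "violin"]

    def memorable_password(word_count: int = 4, separator: str = "-") -> str:
        """Join randomly chosen words into a passphrase using a CSPRNG."""
        return separator.join(secrets.choice(WORDS) for _ in range(word_count))

    print(memorable_password())  # e.g. "violin-staple-cobalt-orange"

The length of the resulting passphrase, plus the size of the word list, is what provides the strength; the words themselves just make it easier for a human to type and remember.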

Lastly, I like that you can have different “vaults” – groups of saved logins etc. I can segregate ones for work from personal ones and have actually created a vault for sites I no longer use, but want to keep the details in case of future data breaches. That keeps my active logins list a lot smaller and more manageable.

Similar features implemented in different ways

Various sites I subscribe to have different URLs but the same login and password: Amazon.com and Amazon.ca, for instance, or GPUG.com and all of the other UG sites. In LastPass these are handled via an "Equivalent Domains" feature, where you can declare "Amazon.com and Amazon.ca are equivalent domains". That lets you keep 1 saved login and avoids the "you have 2 logins with the same password" warning.

In 1Password, it appears you can list multiple URLs on a single login, which is a different way of implementing the same feature. I quite like this because you can also label each URL if you need to, rather than each simply being called "website". I didn't in the example above, but I could label them Canada, USA or whatever makes sense for the different sites sharing one login/password combination. In this case they are linked, so you can't have different logins for the sites. If I have the choice not to reuse a login and password combination, I would rather separate the logins; but some related sites use the same authentication in the back end, so you don't have that choice.

One LastPass feature I’ll miss

The one feature I’ll miss that I can’t find on 1Password was something I don’t know the name of but I could lock some passwords in and require that I re-type my password if I used them. I put that on my banking sites, anything work-related, and anything with financial info saved (sites with credit cards you can’t remove) and things like that. If that’s available in 1Password, I’ve yet to find it. It was a nice 2nd layer to force me to authenticate a 2nd time before using that particular login.

Last thoughts

That’s pretty much it for this post. I think what I may do for a #TipTuesday is write a brief primer on how to get started with a password manager if you don’t already have one. Lots of people don’t use one and track their passwords in far less secure ways: in an unsecured note on their phones or computers, or even worse, by re-using passwords to reduce what they have to remember. A password manager, plus working through your logins to re-generate more complex passwords, is a much safer, better alternative!
