
Commit bd9bc55

Migrate to Azure Functions Elastic Premium plan (#2893)
This pull request primarily removes the `parallel-workers` package from the repository and cleans up related code and configuration. Additionally, it refines error reporting in Azure credential management and streamlines hash generation logic by removing unused methods. Below are the most important changes, grouped by theme:

**Parallel-workers package removal:**

* Deleted the entire `packages/parallel-workers` directory, including all source files, configuration files, and tests. [[1]](diffhunk://#diff-25989936846266e0f9905c0127502489515f0b137a328dfed9e31a561ccd819aL1-L9) [[2]](diffhunk://#diff-a8b0eb3771036e137c69d2f3b9750b077d5728f50fce4ff02c6958ec681c2536L1-L46) [[3]](diffhunk://#diff-4d638471a511daeea3e736674bb59d158210b45bcd822c5b2db42d89bc402d04L1-L8) [[4]](diffhunk://#diff-7a556c7cebb73de26cd321fc591f315d0e943369427696919f21e6f8364a09bdL1-L4) [[5]](diffhunk://#diff-1f31bb2f012812594874b44d5714037998a44d9550b7f3a268b68823d6331d22L1-L83) [[6]](diffhunk://#diff-1c4cbe11be8ac467da7364d7c95a23bd239968517396f72b30c896eee9319bd6L1-L55) [[7]](diffhunk://#diff-d0a67630bdf2a3aeed9ebecef997065069baf7358d643bf593c93223380947a9L1-L6)
* Removed references to `parallel-workers` from workspace configuration files, build scripts, and package lists (`.vscode/workspace.code-workspace`, `package.json`, and related test/build configs). [[1]](diffhunk://#diff-add87b7a14c2f7b31686c392cab5f20ed6f1245e2bb675aa78eb845d1a60102bL51-L54) [[2]](diffhunk://#diff-8314470685c1f819d496e2aac5f58b3230753af65a0c87b3a4456ccab50b6a13L21) [[3]](diffhunk://#diff-54e9026b0beaa1299fa79d49a2fade8542db7613c4de23abf2dba289eefba6e2L68-L72)

**Hash generator simplification:**

* Removed unused hash generation methods (`getWebsiteScanResultDocumentId`, `getWebsiteScanResultPartDocumentId`, and related test cases) from `packages/common/src/ciphers/hash-generator.ts` and `packages/crawler/src/common/hash-generator.ts`. [[1]](diffhunk://#diff-c2186d9bc818cf91de90837f18062015a2f4fe0517329f58e3c205249d9efd42L20-L34) [[2]](diffhunk://#diff-0e904fddbcaeb6ae5e54fe80fea0ecb7a60b7bdde81ec262914ece863e06ecc3L45-L65) [[3]](diffhunk://#diff-5de80d0364ae76197deaddbe3f3bf6f1d6be292cdf5b96cbacb5c288dc8fda71L26-L51) [[4]](diffhunk://#diff-5ad86124b1e8180a5c1aaa5523220fda4ee6ccefd80345f3f98a19b352803492L15-L38)
* Cleaned up imports in `packages/crawler/src/common/hash-generator.ts` by removing unused dependencies.

**Azure credential error reporting:**

* Improved error messages in `IdentityCredentialCache` to provide more context, including client ID, scope, and stack trace. Updated the corresponding test to match the new error message. [[1]](diffhunk://#diff-a52017d64031aaf6e5a3c418ac0fffbc4ba982de6c03cf1d734ee0fb69cabd38L44-R48) [[2]](diffhunk://#diff-7a27d7be70cbac5e5d29d16077b21535e3b68a2e76918745ab6452bc7226134bL122-R122)

**OpenAPI contract adjustment:**

* Updated `packages/api-contracts/openapi.json` to remove the `scanType` property from one schema and restrict its enum to `"accessibility"` in another, reflecting a change in supported scan types. [[1]](diffhunk://#diff-d0825d6cd1593677824dda2a6df69779218ebfe33c83a4aa53560c2542743efbL223-L228) [[2]](diffhunk://#diff-d0825d6cd1593677824dda2a6df69779218ebfe33c83a4aa53560c2542743efbR540-R545)

**Build and ignore file updates:**

* Broadened the ignore pattern for `.txt` files in `.prettierignore` and adjusted the `clean` script in `package.json` to remove fewer directories. [[1]](diffhunk://#diff-b640b344ee7f3f03d2a443795a5d0708ef50e2e6e34214109ab2aad13ad6ba98L30-R30) [[2]](diffhunk://#diff-7ae45ad102eab3b6d7e7896acd08c427a9b25b346470d7bc6507b6481575d519L11-R11)
1 parent 8814fa7 commit bd9bc55


69 files changed: +359 -2151 lines changed


.prettierignore

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ dist/
 **/.env.*
 **/yarn-*.log
 **/*.ts.snap
-copyright-header.txt
+**/*.txt
 **/.DS_Store
 **/.funcignore
 **/extensions.csproj

.vscode/workspace.code-workspace

Lines changed: 0 additions & 4 deletions
@@ -48,10 +48,6 @@
         "name": "logger",
         "path": "../packages/logger"
     },
-    {
-        "name": "parallel-workers",
-        "path": "../packages/parallel-workers"
-    },
     {
         "name": "privacy-scan-core",
         "path": "../packages/privacy-scan-core"

package.json

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@
     },
     "scripts": {
         "packages": "yarn workspaces foreach --no-private --parallel --verbose --all",
-        "clean": "yarn packages run clean && rimraf dist test-results ./**/coverage",
+        "clean": "yarn packages run clean && rimraf dist test-results",
         "build": "yarn packages --topological-dev run build",
         "cbuild": "yarn packages --topological-dev run cbuild",
         "dbuild": "yarn install && npm-run-all --serial syncpack:fix build",

packages/api-contracts/openapi.json

Lines changed: 6 additions & 6 deletions
@@ -220,12 +220,6 @@
         "type": "integer",
         "example": 100
     },
-    "scanType": {
-        "description": "The targeted type of the scan",
-        "type": "string",
-        "enum": ["accessibility", "privacy"],
-        "example": "accessibility"
-    },
     "scanNotifyUrl": {
         "type": "string",
         "example": "https://www.example.com/api/notification"
@@ -543,6 +537,12 @@
         "type": "string",
         "example": "https://accessibilityinsights.io/"
     },
+    "scanType": {
+        "description": "The type of the scan",
+        "type": "string",
+        "enum": ["accessibility"],
+        "example": "accessibility"
+    },
     "deepScanId": {
         "type": "string",
         "example": "d6cdd091-a985-4dfd-a046-57c7f4d23bfc"

packages/azure-services/src/credentials/identity-credential-cache.spec.ts

Lines changed: 1 addition & 1 deletion
@@ -119,7 +119,7 @@ describe(IdentityCredentialCache, () => {
        getAccessToken = async () => Promise.reject(new Error('msi service error'));

        await expect(identityCredentialCache.getToken(scopes, undefined, getAccessToken)).rejects.toThrowError(
-            /Credential provider has failed./,
+            /Azure credential provider has failed./,
        );
    });
});

packages/azure-services/src/credentials/identity-credential-cache.ts

Lines changed: 5 additions & 1 deletion
@@ -41,7 +41,11 @@ export class IdentityCredentialCache {
            try {
                return await getAccessToken();
            } catch (error) {
-                throw new Error(`Credential provider has failed. ${System.serializeError(error)}`);
+                throw new Error(
+                    `Azure credential provider has failed. ClientId: ${clientId}. Scope: ${scope}. Error: ${System.serializeError(
+                        error,
+                    )}. Stack: ${new Error().stack}`,
+                );
            }
        });
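
The enriched wrapping above can be reproduced in isolation. A hedged sketch follows: `serializeError` is a hypothetical stand-in for the repository's `System.serializeError` helper, and `getTokenWithContext` is illustrative rather than the actual cache method.

```ts
// Standalone sketch; serializeError is a stand-in for System.serializeError.
const serializeError = (error: unknown): string =>
    error instanceof Error ? `${error.name}: ${error.message}` : JSON.stringify(error);

async function getTokenWithContext(
    clientId: string,
    scope: string,
    getAccessToken: () => Promise<string>,
): Promise<string> {
    try {
        return await getAccessToken();
    } catch (error) {
        // Surface client id, scope, the serialized inner error, and a fresh stack trace,
        // mirroring the message now thrown by IdentityCredentialCache.
        throw new Error(
            `Azure credential provider has failed. ClientId: ${clientId}. Scope: ${scope}. ` +
                `Error: ${serializeError(error)}. Stack: ${new Error().stack}`,
        );
    }
}
```

The updated test earlier in this commit asserts against this prefix with `/Azure credential provider has failed./`.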

packages/common/src/ciphers/hash-generator.spec.ts

Lines changed: 0 additions & 21 deletions
@@ -42,27 +42,6 @@ describe('HashGenerator', () => {
        expect(id).toEqual(expectedId);
    });

-    it('generate getWebsiteScanPageDataDocumentId', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const id = hashGenerator.getWebsiteScanPageDataDocumentId('websiteId', 'scanId');
-        const expectedId = hashGenerator.generateBase64Hash('websiteId', 'scanId');
-        expect(id).toEqual(expectedId);
-    });
-
-    it('generate WebsiteScanResultDocumentId', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const id = hashGenerator.getWebsiteScanResultDocumentId('baseUrl', 'scanGroupId');
-        const expectedId = hashGenerator.generateBase64Hash('baseUrl', 'scanGroupId');
-        expect(id).toEqual(expectedId);
-    });
-
-    it('generate WebsiteScanResultPartDocumentId', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const id = hashGenerator.getWebsiteScanResultPartDocumentId('baseId', 'scanId');
-        const expectedId = hashGenerator.generateBase64Hash('baseId', 'scanId');
-        expect(id).toEqual(expectedId);
-    });
-
    it('should generate same hash every time without stubbing', () => {
        hashGenerator = new HashGenerator(SHA);
        const hash1 = hashGenerator.generateBase64Hash('u1', 'f1', 's1', 'r1');

packages/common/src/ciphers/hash-generator.ts

Lines changed: 0 additions & 15 deletions
@@ -17,21 +17,6 @@ export class HashGenerator {
        return this.generateBase64Hash(baseUrl, scanGroupId);
    }

-    public getWebsiteScanPageDataDocumentId(websiteId: string, scanId: string): string {
-        // Preserve parameters order below for the hash generation compatibility
-        return this.generateBase64Hash(websiteId, scanId);
-    }
-
-    public getWebsiteScanResultDocumentId(baseUrl: string, scanGroupId: string): string {
-        // Preserve parameters order below for the hash generation compatibility
-        return this.generateBase64Hash(baseUrl, scanGroupId);
-    }
-
-    public getWebsiteScanResultPartDocumentId(baseId: string, scanId: string): string {
-        // Preserve parameters order below for the hash generation compatibility
-        return this.generateBase64Hash(baseId, scanId);
-    }
-
    public getDbHashBucket(prefix: string, ...values: string[]): string {
        // Changing buckets count will affect bucket generation of the same values
        return this.getHashBucket(prefix, 1000, ...values);
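
The methods deleted here were thin wrappers over `generateBase64Hash`, so equivalent ids can still be computed directly as long as the original parameter order is preserved. A rough sketch under stated assumptions: Node's `crypto` module stands in for the class's SHA dependency, and the value-concatenation scheme is illustrative, not the repository's.

```ts
import { createHash } from 'crypto';

// Stand-in for HashGenerator.generateBase64Hash; the real class wraps a SHA
// implementation and its exact handling of multiple values may differ from the
// join() used here.
const generateBase64Hash = (...values: string[]): string =>
    createHash('sha256').update(values.join('|')).digest('base64');

// Equivalent of the removed getWebsiteScanResultDocumentId(baseUrl, scanGroupId).
// Keep the (baseUrl, scanGroupId) order for hash generation compatibility,
// as the removed comments noted.
const websiteScanResultDocumentId = generateBase64Hash('https://example.com/', 'scan-group-1');

console.log(websiteScanResultDocumentId);
```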

packages/crawler/src/build-utilities/monorepo-packages.spec.ts

Lines changed: 0 additions & 1 deletion
@@ -18,7 +18,6 @@ describe('listMonorepoPackageNames', () => {
          "functional-tests",
          "health-client",
          "logger",
-          "parallel-workers",
          "privacy-scan-core",
          "privacy-scan-job-manager",
          "privacy-scan-runner",

packages/crawler/src/common/hash-generator.spec.ts

Lines changed: 0 additions & 26 deletions
@@ -23,32 +23,6 @@ describe('HashGenerator', () => {
        hashGenerator = new HashGenerator(shaJsMock.object);
    });

-    it('generate hash bucket', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const bucket = hashGenerator.getHashBucket('bucket', 300, 'id1', 'id2', 'id3');
-        expect(bucket).toEqual('bucket-101');
-    });
-
-    it('generate DB hash bucket with preset range', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const bucket = hashGenerator.getDbHashBucket('bucket', 'id1', 'id2', 'id3');
-        expect(bucket).toEqual('bucket-425');
-    });
-
-    it('generate WebsiteScanResultDocumentId', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const id = hashGenerator.getWebsiteScanResultDocumentId('baseUrl', 'scanGroupId');
-        const expectedId = hashGenerator.generateBase64Hash('baseUrl', 'scanGroupId');
-        expect(id).toEqual(expectedId);
-    });
-
-    it('generate WebsiteScanResultPartDocumentId', () => {
-        hashGenerator = new HashGenerator(SHA);
-        const id = hashGenerator.getWebsiteScanResultPartDocumentId('baseId', 'scanId');
-        const expectedId = hashGenerator.generateBase64Hash('baseId', 'scanId');
-        expect(id).toEqual(expectedId);
-    });
-
    it('should generate same hash every time without stubbing', () => {
        hashGenerator = new HashGenerator(SHA);
        const hash1 = hashGenerator.generateBase64Hash('u1', 'f1', 's1', 'r1');
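
The bucket tests removed here exercised `getHashBucket` and `getDbHashBucket`, which map a set of values onto a named bucket (the deleted expectations were `bucket-101` for a count of 300 and `bucket-425` for the preset DB count of 1000). A rough sketch of that kind of bucketing, assuming the bucket index is derived from a hash modulo the bucket count; the real derivation lives in the shared `HashGenerator` and may differ.

```ts
import { createHash } from 'crypto';

// Assumed scheme: hash the joined values, reduce to an integer, and take it modulo
// the bucket count; the actual HashGenerator implementation may differ in detail.
function getHashBucket(prefix: string, bucketCount: number, ...values: string[]): string {
    const digest = createHash('sha256').update(values.join('|')).digest();
    const index = digest.readUInt32BE(0) % bucketCount;

    return `${prefix}-${index}`;
}

// Mirrors getDbHashBucket, which pins the bucket count at 1000 per the common
// hash-generator.ts shown earlier in this commit.
const getDbHashBucket = (prefix: string, ...values: string[]): string =>
    getHashBucket(prefix, 1000, ...values);

console.log(getDbHashBucket('bucket', 'id1', 'id2', 'id3')); // e.g. "bucket-<0..999>"
```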
