Function Code Sample

Below are several usage samples for Function-based Cloud Hosting. For simplicity, these samples are primarily written in JavaScript.

Importing External Modules

Within a function, you can import external modules such as lodash and use the methods they provide:

Function Code File Sample:

// index.js
const _ = require('lodash')

exports.main = function (event, context) {
  return _.kebabCase('Hello world')
}

package.json file:

// package.json
{
  "name": "example",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "lodash": "^4.17.21"
  }
}

After the function executes, the client receives a hello-world response.
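
For example, assuming the function is reachable over HTTP, a request like the following (the 'url' below is a placeholder for the function's actual access address) would return hello-world in the response body:

curl --location 'url'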

Sharing Code Between Different Functions

By leveraging the single-instance, multi-function capability and inter-function routing, you can share common modules between functions using standard module imports.

Suppose functions funcA and funcB need to share a now method that returns the current time, with the following directory structure:

.
├── cloudbase-functions.json       # Multiple functions configuration file
├── cloudrunfunctions/common       # Common methods directory
│   └── time.js                    # Common methods
├── cloudrunfunctions/funcA        # Function A directory
│   └── index.js
└── cloudrunfunctions/funcB        # Function B directory
    ├── package.json               # Specifies index.mjs as the entry file
    └── index.mjs

Export the now method in time.js:

// common/time.js
exports.now = function () {
  return new Date().toLocaleString()
}

In the code of function A and function B, you can directly import it:

// cloudrunfunctions/funcA/index.js
const now = require('../common/time').now

// cloudrunfunctions/funcB/index.mjs
import { now } from '../common/time.js'

In cloudbase-functions.json, declare different functions:

{
  "functionsRoot": "./cloudrunfunctions/", // Functions root directory, specified as the current directory
  "functions": [ // Declare each function and its entry file
    {
      "name": "funcA", // Declare function A
      "directory": "funcA",
      "triggerPath": "/a"
    },
    {
      "name": "funcB", // Declare function B
      "directory": "funcB",
      "triggerPath": "/b"
    }
  ]
}

In this way, function A and function B are deployed in the same service, and both can use the now method.
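
For illustration, a minimal sketch of funcA's entry file might look like the following; the response text is hypothetical and simply echoes the shared now method:

// cloudrunfunctions/funcA/index.js (minimal sketch)
const now = require('../common/time').now

exports.main = function (event, context) {
  // Return the current time produced by the shared common module
  return `funcA time: ${now()}`
}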

Routing in Functions

Using the HTTP path, query string, and other information available from the context, you can implement simple routing.

Function code:

exports.main = function (event, context) {
  const { httpContext } = context
  const { url } = httpContext
  const path = new URL(url).pathname

  // Return different content based on the access path
  switch (path) {
    case '/':
      return {
        statusCode: 200,
        body: 'Hello world!'
      }
    case '/index.html':
      return {
        statusCode: 200,
        headers: {
          'Content-Type': 'text/html'
        },
        body: '<h1>Hello world!</h1>'
      }
    default:
      return {
        statusCode: 404,
        body: 'Not found'
      }
  }
}

Returning Different Types of Responses

Different routing paths can return different response types and response content.

Function code:

exports.main = function (event, context) {
  const { httpContext } = context
  const { url } = httpContext
  const path = new URL(url).pathname

  // Return different content based on the access path
  switch (path) {
    // Return a string directly
    case '/':
      return 'Hello world!'
    // Return current timestamp
    case '/now':
      return new Date().getTime()
    // Return HTML using integrated response
    case '/index.html':
      return {
        statusCode: 200,
        headers: {
          'Content-Type': 'text/html'
        },
        body: '<h1>Hello world!</h1>'
      }
    // Use integrated response to return JSON
    default:
      return {
        statusCode: 404,
        headers: {
          'Content-Type': 'application/json'
        },
        body: {
          message: 'Not found'
        }
      }
  }
}

Integrated and non-integrated responses can be used in combination to achieve a richer variety of response types.

Using Server-Sent Events to push messages

To support the SSE protocol commonly used by AI large-model APIs, function-based cloud hosting can push content to clients via SSE.

Function code:

exports.main = async function (event, context) {
  // Switch to SSE mode
  const sse = context.sse()

  // Optional parameters to set response headers for the SSE connection
  // const sse = context.sse({
  //   keepalive: false, // Whether to keep the connection alive; enabled by default, can be disabled
  //   headers: {
  //     'Mcp-Session-ID': 'this-is-a-mcp-session-id',
  //     'X-ABC': ['A', 'B', 'C'],
  //   }
  // })

  sse.on('close', () => {
    console.log('sse closed')
  })

  // Send events to the client; check whether the connection is closed before sending, and send only if it is still open
  if (!sse.closed) {
    // Send multiple events, one at a time
    sse.send({ data: 'No.1 message' })
    sse.send({ data: 'No.2 message with\n\r\r\n\r\r\rtwo lines.' })

    // Send multiple events in a single batch
    sse.send([
      { data: 'No.1 message' },
      { data: 'No.2 message with\n\r\r\n\r\r\rtwo lines.' }
    ])

    // Below is an example of sending raw messages
    // This method can be used to extend the SSE protocol, for example, to send other event fields
    // Note: data is sent immediately only if there is a newline character at the end
    sse.send('message: This is a raw message. ')
    sse.send(['message: This is another raw message.\n\n'])

    // Function execution time is calculated based on when the function returns
    // After the function returns, HTTP request processing completes; asynchronous logic within the function continues to run without affecting the function's return time
    // The TCP network connection remains occupied by SSE; messages can continue to be sent to the client until the SSE connection is closed by either the client or the server
    // The SSE protocol turns HTTP into a long-lived connection; either the client or the server must actively close the connection when appropriate, otherwise persistent connections will consume network resources
    // Because the party that actively closes a TCP connection enters the TIME_WAIT state, a large number of connections in TIME_WAIT can exhaust network resources and prevent new connections from being established; having the client close the connection therefore better aligns with best practices
    // Because the client may not know when to close the connection, the server can send a special message informing the client that the messages have ended and it can close the connection
    // In the browser, call EventSource#close to close the connection. See: https://developer.mozilla.org/en-US/docs/Web/API/EventSource/close
    return ''
  }
}
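
On the client side, a browser can consume the SSE stream with the standard EventSource API. Below is a minimal sketch; the URL is a placeholder for the function's HTTP access path, and the '[DONE]' sentinel is only a hypothetical convention for the "messages have ended" signal mentioned above:

// Browser-side sketch for consuming the SSE endpoint
// '/sse-function-url' is a placeholder; replace it with the function's actual access path
const source = new EventSource('/sse-function-url')

source.onmessage = (event) => {
  console.log('received:', event.data)

  // Hypothetical convention: the server sends '[DONE]' to signal that no more messages will follow
  if (event.data === '[DONE]') {
    // The client actively closes the connection, as recommended above
    source.close()
  }
}

source.onerror = (err) => {
  console.error('SSE error:', err)
  source.close()
}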

Using WebSocket persistent connections to send and receive messages

Function code:

exports.main = function (event, context) {
  console.log({ event, context })
  if (context.ws) {
    context.ws.on('close', (msg) => {
      console.log('close: ', msg)
    })
    context.ws.on('message', (msg) => {
      console.log('message: ', msg)
    })
    setInterval(() => {
      context.ws.send(`now: ${new Date().toISOString()}`)
    }, 100)
  }
}

// Supports both synchronous and asynchronous implementations
exports.main.handleUpgrade = async function (upgradeContext) {
  console.log(upgradeContext, 'upgradeContext')
  if (upgradeContext.httpContext.url === '/upgrade-handle-throw-error') {
    throw new Error('test throw error')
  } else if (upgradeContext.httpContext.url === '/upgrade-handle-reject-error') {
    return Promise.reject(new Error('test reject error'))
  } else if (upgradeContext.httpContext.url === '/allow-websocket-false') {
    return {
      allowWebSocket: false,
      statusCode: 403,
      body: JSON.stringify({ code: 'code', message: 'message' }),
      contentType: 'application/json; charset=utf-8'
    }
  }
  return { allowWebSocket: true }
}

Node.js client code:

import WebSocket from 'ws'

function run () {
  const ws = new WebSocket('ws://127.0.0.1:3000/')

  ws.on('close', (code, reason) => {
    console.log('close:', code, `${reason}`)
  })
  ws.on('error', (err) => {
    console.error('error: ', err)
  })
  ws.on('upgrade', () => {
    console.log('upgrade')
  })
  ws.on('ping', () => {
    console.log('recv ping message')
  })
  ws.on('pong', () => {
    console.log('recv pong message')
    setTimeout(() => {
      ws.ping()
    }, 1000)
  })
  ws.on('unexpected-response', (req, res) => {
    // Non-upgrade responses and 3xx redirect responses are considered unexpected-response
    console.log('recv unexpected-response message')
  })

  ws.on('message', (data) => {
    console.log('received: %s', data)
  })

  ws.on('open', () => {
    ws.ping()
    ws.send('string data')
    ws.send(Buffer.from('buffer data'))
  })
}

run()

Submitting form data (files) using multipart/form-data

Function-based Cloud Hosting supports submitting form content from the frontend in multipart/form-data format, which can include file content.

Function code:

const path = require('path')
const fs = require('fs')

exports.main = async function (event, context) {
  // Retrieve the file to be saved from the event.file property (corresponds to the passed parameter)
  // event.file is of type PersistentFile. See: https://www.npmjs.com/package/formidable#file
  // If there are other parameters passed, they can be obtained using the corresponding name in the form data, such as event.name, event.size, etc.
  const file = event.file
  // Get the original file name
  const fileName = file.originalFilename
  // Directory for file storage
  const fileDir = path.join(process.cwd(), 'tmp')
  if (!fs.existsSync(fileDir)) {
    fs.mkdirSync(fileDir)
  }
  const filePath = path.join(fileDir, fileName)
  // Read the file stream from the parameters
  const readStream = fs.createReadStream(file.filepath)
  // Attempt to save the file to the specified directory
  try {
    await fs.promises.writeFile(filePath, readStream)
  } catch (error) {
    return {
      statusCode: 500,
      body: `Error saving file: ${error.message}`
    }
  }

  // Note: Delete temporary files

  return {
    statusCode: 200,
    body: `File saved to: ${filePath}`
  }
}

Send a file upload request:

curl --location 'url' \
--form 'file=@file.png'

Note:

  1. After the file upload is completed, the file is saved locally. You should delete it after function execution to avoid excessive disk space usage, as shown in the cleanup sketch below.
  2. If uploaded files need to be stored persistently, save them to cloud storage or another service rather than keeping them locally, to prevent file loss.
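
A minimal cleanup sketch, assuming the same filePath variable as in the function code above (where exactly you delete the file depends on your own logic):

// Cleanup sketch: remove the temporary file once it is no longer needed
// Assumes filePath refers to the file saved in the sample above
try {
  await fs.promises.unlink(filePath)
} catch (error) {
  // Failing to delete a temporary file should not fail the request; log it instead
  console.error(`Error deleting temp file: ${error.message}`)
}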

Using PUT to upload binary data or files

Function code:

const path = require('path')
const fs = require('fs')

exports.main = async function (event, context) {
  const { httpContext } = context
  const { url } = httpContext
  // Get the file name from the query
  const filename = new URL(url).searchParams.get('filename')
  // Directory for file storage
  const fileDir = path.join(process.cwd(), 'tmp')
  if (!fs.existsSync(fileDir)) {
    fs.mkdirSync(fileDir)
  }
  // File storage path
  const filePath = path.join(fileDir, filename)

  try {
    // Get the file content from the event
    const buffer = Buffer.from(event, 'binary')
    await fs.promises.writeFile(filePath, buffer)
    return {
      statusCode: 200,
      body: `File saved to: ${filePath}`
    }
  } catch (error) {
    return {
      statusCode: 500,
      body: `Error saving file: ${error.message}`
    }
  }
}

Send a file upload request:

curl --location --request PUT 'url?filename=file.png' \
--header 'Content-Type: application/octet-stream' \
--data-binary '@file.png'

Using puppeteer for webpage screenshots

package.json file:

// package.json
{
  "name": "example",
  "version": "1.0.0",
  "main": "index.mjs",
  "type": "module",
  "dependencies": {
    "puppeteer": "^23.11.1"
  }
}

Function Code File Sample:

// index.mjs
import * as path from 'path'
import * as url from 'url'
import * as fs from 'fs'
import puppeteer from 'puppeteer'

const __filename = url.fileURLToPath(import.meta.url)
const __dirname = path.dirname(__filename)

async function screenshotWebPage(webPageUrl, savePath) {
  const browser = await puppeteer.launch({
    args: ['--no-sandbox', '--disable-setuid-sandbox']
  })
  const page = await browser.newPage()

  await page.goto(webPageUrl)
  // Render the page to a PDF file (page.screenshot could be used for an image instead)
  await page.pdf({
    path: savePath,
    format: 'letter',
  })

  await browser.close()
}

export const main = async function (event, context) {
  const webPageUrl = 'https://docs.cloudbase.net/cbrf/intro'
  const savePath = 'cbrf-intro.pdf'
  await screenshotWebPage(webPageUrl, savePath)

  // Return the generated PDF as a file download
  return {
    statusCode: 200,
    headers: {
      'Content-Disposition': `attachment; filename=${savePath}`,
      'Content-Type': 'application/pdf',
    },
    body: fs.createReadStream(path.join(__dirname, savePath))
  }
}
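
To try the sample, a request such as the following saves the generated PDF locally (the 'url' below is a placeholder for the function's HTTP access address):

curl --location 'url' --output cbrf-intro.pdf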