OpenAI forms AI safety committee after key departures
OpenAI forms AI safety committee after key departures / Photo: © AFP/File

OpenAI, the company behind ChatGPT, announced the formation of a new safety committee on Tuesday, weeks after the departures of key executives raised questions about the firm's commitment to mitigating the dangers of artificial intelligence.

The company said the committee, which will include CEO Sam Altman, is being established as OpenAI begins training its next AI model, expected to surpass the capabilities of the GPT-4 system powering ChatGPT.

"While we are proud to build and release industry-leading models on both capabilities and safety, we welcome a robust debate at this important juncture," OpenAI stated.

Composed of board members and executives, the committee will spend the next 90 days comprehensively evaluating and bolstering OpenAI's processes and safeguards around advanced AI development.

OpenAI stated it will also consult outside experts during this review period, including former US cybersecurity officials Rob Joyce, who previously led efforts at the National Security Agency, and John Carlin, a former senior Justice Department official.

Over the three-month span, the group will scrutinize OpenAI's current AI safety protocols and develop recommendations for potential enhancements or additions.

After this 90-day review, the committee's findings will be presented to the full OpenAI board before being publicly released.

The committee's formation comes on the heels of recent executive departures that stoked concerns about OpenAI's AI safety priorities.

Earlier this month, the company dissolved its "superalignment" team dedicated to mitigating long-term AI risks.

In announcing his exit, team co-lead Jan Leike criticized OpenAI for prioritizing "shiny new products" over vital safety work in a series of posts on X, the platform previously known as Twitter.

"Over the past few months, my team has been sailing against the wind," Leike said.

OpenAI has also faced controversy over an AI voice some claimed closely mimicked actress Scarlett Johansson, though the company denied attempting to impersonate the Hollywood star.

K.Lam--ThChM